10 bit color


  • 10 bit color

    Origination
    Until we get cameras that can record the HEIF format natively, we'll have to work from "raw", such as the CR2 format. The HEIF standard appears to be intended to implement 10-bit color, while CR2 is, from what I read, a 12-bit standard. See https://cpn.canon-europe.com/content...8_or_16_bit.do
    A 12-bit CR2 will work fine to support the 10-bit HEIF standard, although it will be necessary to "shoot raw", upload the raw files, and then convert to HEIF/10-bit, at least for cameras currently set up for the JPEG standard.
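
    A minimal sketch of that raw-first workflow, assuming the rawpy and imageio Python packages are installed (file names are placeholders): develop the CR2 at 16 bits per channel so the extra depth is preserved until a 10-bit HEIF encoder is available.

```python
# Develop a Canon .CR2 into a 16-bit-per-channel TIFF so the extra bit depth
# survives until it can be encoded as 10-bit HEIF.
import rawpy
import imageio

with rawpy.imread("IMG_0001.CR2") as raw:          # placeholder file name
    rgb16 = raw.postprocess(output_bps=16)         # 16 bits per channel instead of 8

imageio.imwrite("IMG_0001_16bit.tiff", rgb16)
print(rgb16.dtype, rgb16.shape)                    # expect uint16, (height, width, 3)
```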

    Presentation
    Presentation involves a new set of issues. First off, a monitor will be needed that supports 1 billion colors rather than the 16 million that is common today and associated with 8-bit color. That's not all, though: the graphics card that drives the display will need to have 10-bit color support. Last but not least, the photographic software will need to support 10-bit color.
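
    Those "16 million" and "1 billion" figures fall straight out of the per-channel bit depth; a quick check of the arithmetic:

```python
# An N-bit-per-channel RGB display can show (2**N)**3 distinct colors.
for bits in (8, 10):
    levels = 2 ** bits            # tonal levels per channel
    colors = levels ** 3          # R x G x B combinations
    print(f"{bits}-bit: {levels} levels per channel, {colors:,} colors")
# 8-bit:  256 levels per channel, 16,777,216 colors
# 10-bit: 1024 levels per channel, 1,073,741,824 colors
```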

    Spec Sheets
    Any way you cut the cookies, it's going to be necessary to get 10-bit-capable components in service "from the camera to the screen", and this won't be trivial. Where just about everything is set up for 8-bit color the equipment is more or less "plug and play", but as we move toward HEIF and 10-bit color we'll have to be on the lookout for traps and pitfalls. For example, HDMI ports and DisplayPort connections have been made in different versions, and we'd need to be sure the port we plan to use supports the wide-gamut color space that goes with HEIF, which may be known as BT.2020 or Rec.2020, i.e. UHDTV.
    Last edited by bill albert; 12-01-2019, 08:47 AM.

  • #2
    Here is a great site where you can learn about color image processing.
    https://www.cambridgeincolour.com

    Specifically, these may be of interest for this thread:
    https://www.cambridgeincolour.com/tu.../bit-depth.htm
    https://www.cambridgeincolour.com/co...t-printing.htm



    • #3
      10-bit update
      On checking things I find that my RX570 graphics card and PA279Q monitor support 10-bit color (1 billion colors), and my Windows 10 system had already implemented that as the standard setting.

      The next trick is to get something formatted in this mode. The TIFF format has supported 48-bit color (3 x 16) since who knows when, so that might be an option. Working from Canon DPP I was able to export a .CR2 image into 48-bit TIFF format:
      [Attachment: TIFF-48bit.png]

      The interesting question then is: does ACDSee display this at 10-bit depth, or convert it to 8-bit? How would I find out?
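
      One thing that can be checked directly is what the exported file actually contains, independent of any viewer. A minimal sketch, assuming the tifffile Python package is installed and using the exported file name as a placeholder:

```python
# Report the pixel type stored in the TIFF on disk.
import tifffile

img = tifffile.imread("TIFF-48bit.tif")        # placeholder file name
bits_per_channel = img.dtype.itemsize * 8
print(img.shape, img.dtype)                    # e.g. (3456, 5184, 3) uint16
print(f"{bits_per_channel} bits per channel on disk")
```

      That only confirms what is in the file; whether ACDSee passes it to the screen at 10 bits is a separate question about the display pipeline.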



      • #4
        Select an image. Look in the status bar at the bottom. You will see xxxx x yyyy x bb, which reflects the resolution and the bit depth. Don't forget, RAW and TIFF can include an alpha channel in addition to the colors. (Side note: Nikon RAW captures 12-bit or 14-bit images.)
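
        A rough script equivalent of that status-bar readout, assuming Pillow is installed (the file name is a placeholder). Pillow's mode string hints at the layout, e.g. "RGB" is 8 bits per channel, "RGBA" adds an alpha channel, "I;16" is 16-bit:

```python
# Print a "width x height x channels" readout for an image file.
from PIL import Image

with Image.open("example.tif") as im:          # placeholder file name
    bands = im.getbands()                      # e.g. ('R', 'G', 'B', 'A')
    print(f"{im.width} x {im.height}, mode {im.mode}, {len(bands)} channel(s): {bands}")
```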



        • #5
          Originally posted by GusPanella
          Select an image. Look in the status bar at the bottom. You will see xxxx x yyyy x bb, which reflects the resolution and the bit depth. Don't forget, RAW and TIFF can include an alpha channel in addition to the colors. (Side note: Nikon RAW captures 12-bit or 14-bit images.)
          Thanks!! This is an interesting topic to explore.

          The Windows 10 Display Info screen shows the monitor is in 10-bit mode. The RX570 card runs in full-screen 10-bit mode, i.e. the entire screen is converted to 10-bit color. So when an image of 8-bit color is displayed, either Windows, or more likely the graphics card, has to convert it to 10-bit color depth.

          Which leads to an interesting question: if the image source is 10-bit, or, as with .tiff, 16-bit, how does the software feed that into the graphics system? If the graphics card is automatically converting the entire stream to 10-bit, then everything sent to the screen will need to be 8-bit in order to display properly.

          I may need a better graphics card: some are apparently capable of splitting the stream so as to show 8-bit or 10-bit color as requested by the sending application.

          What I've noticed thus far is that my 16-bit .tiff image seems to have lost some brightness, but I need to look into this a bit more to see if I did it during the conversion.
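
          One way to narrow that down is to compare the two files numerically on a common 0-1 scale, which separates a conversion problem from a display-path problem. A minimal sketch, assuming imageio and numpy are installed and using placeholder file names for an 8-bit version and the 16-bit export of the same shot:

```python
# Compare mean brightness of an 8-bit version and the 16-bit export.
import imageio
import numpy as np

jpg = imageio.imread("IMG_0001.jpg").astype(np.float64) / 255.0           # 8-bit version
tif = imageio.imread("IMG_0001_16bit.tiff").astype(np.float64) / 65535.0  # 16-bit export

print("mean level, JPEG:", jpg.mean())
print("mean level, TIFF:", tif.mean())
# If the TIFF mean is clearly lower, the darkening happened during conversion
# (for example a different gamma or color profile), not in the graphics card.
```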

          I only have the base version of ACDSee, so I use Canon DPP to process .CR2 (Canon raw) files, and that program provides for conversion to .tiff format.

          Amendment 1:

          This research is in response to the recent appearance of the HEIF/HEIC/HEVC family of "High Efficiency" formats, which (I think) provide for 10-bit color depth.
          Last edited by bill albert; 12-13-2019, 03:18 AM. Reason: Amendment 1



          • #6
            I know nothing about HEIF. Having a 10-bit (or whatever) final image doesn't mean too much to me.

            I think of image bit depth as the amount of numerical working space to support the filter mathematics. Thus the advantage of working with RAW files over JPG.
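
            A toy illustration of that "working space" idea, assuming numpy is installed: apply a strong brightening curve to the same gradient at 8-bit and 16-bit precision and count how many distinct tonal levels survive.

```python
# Quantize a smooth gradient, apply a strong curve, and re-quantize.
import numpy as np

gradient = np.linspace(0.0, 1.0, 100_000)       # stand-in for real image tones

def edit(values, bits):
    levels = 2 ** bits - 1
    q = np.round(values * levels) / levels      # quantize to the working bit depth
    curved = q ** 0.5                           # strong brightening curve
    return np.round(curved * levels) / levels   # re-quantize after the edit

for bits in (8, 16):
    survivors = len(np.unique(edit(gradient, bits)))
    print(f"{bits}-bit working space keeps {survivors} distinct levels")
# The 8-bit version ends up with far fewer distinct levels than the 16-bit one,
# and with gaps where the curve stretched the shadows -- which is where banding
# shows up after heavy edits.
```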

            I use the color space (ICC profile) to think of what I could actually display, if supported by the monitor (or printer). Certainly there is a link between the two, but I found it convenient to separate the two ideas to develop an understanding of how things really work.

            Taking it one step further, I think of the "final format" as what my audience can see. My monitors have higher resolution and a wider color space than most. However, most people are still limited to sRGB (sometimes smaller). Making the final image in a wider color space can therefore lead to unexpected results when it is converted to the smaller sRGB, so it is important to get the final image into something most monitors and/or printers can use.
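
            A minimal sketch of that final conversion step, assuming Pillow is installed and the source file carries an embedded ICC profile such as Adobe RGB (file names are placeholders):

```python
# Convert a wide-gamut image to sRGB before sharing it.
import io
from PIL import Image, ImageCms

im = Image.open("wide_gamut.jpg")                      # placeholder file name
icc_bytes = im.info.get("icc_profile")                 # embedded source profile, if any
if icc_bytes:
    src_profile = ImageCms.ImageCmsProfile(io.BytesIO(icc_bytes))
    dst_profile = ImageCms.createProfile("sRGB")
    im = ImageCms.profileToProfile(im, src_profile, dst_profile, outputMode="RGB")

im.save("for_sharing.jpg", quality=90)                 # sRGB JPEG most screens show as intended
```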



            • #7
              Thanks for a very thoughtful post.

              Apparently this HEIF/HEVC/HEIC system is being pushed by Apple for their smartphones. From what I've read thus far, the main idea seems to be to take advantage of today's faster chips in order to implement a higher level of data compression. This lets folks store more photos, and particularly video, on their phones, and it cuts the time required for uploading for those syncing with some sort of "cloud" option.

              Higher color depth is also supported, up to 16 bits. Advertisers are pushing "4K" video like they were trying to save their last barrel of whisky off a sinking ship, and this will require higher resolution (3840 x 2160) as well as 10-bit color depth.
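
              Some back-of-the-envelope numbers behind that push, just multiplying out the frame dimensions:

```python
# Uncompressed size of one 3840 x 2160 RGB frame at 8 vs. 10 bits per channel.
width, height, channels = 3840, 2160, 3

for bits in (8, 10):
    megabytes = width * height * channels * bits / 8 / 1_000_000
    print(f"{bits}-bit 4K frame: about {megabytes:.1f} MB uncompressed")
# Roughly 24.9 MB vs. 31.1 MB per frame, which is why the higher bit depth
# arrives together with stronger compression (HEVC/HEIF).
```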

              The big question of course is: will anyone notice? Or does the existing 8-bit sRGB already exceed the limits of our visual perception? I don't know (but I suspect it does).

              People will notice the advertising, though, and for those of us who may be involved in the photography business I think it will be important to know at least something about this. How should I put it: state-of-the-art technology? "Latest fad"? Or maybe just "new photo format"? I don't know, but it does seem like a good idea to learn at least a little about it.

              I use an ASUS PA279Q monitor with a Radeon RX570 graphics card. Windows reports the monitor is in "10-bit" mode. I suspect everything is being sent from the computer software in 8-bit format and then automatically re-scaled to 10-bit, probably by the RX570, and if this is the case the effort thus far is likely of no effect. We'll see, though.

