@@ -236,45 +236,122 @@ Python
236236 print(cpu.applyRGB(imageData))
237237
238238
239- Displaying an image, using the GPU (Full Display Pipeline)
240- **********************************************************
239+ Displaying an image, using the CPU (Full Viewport Pipeline)
240+ ***********************************************************
241+
242+ This alternative version supports a more complex viewing pipeline that
243+ includes the controls typically found in real-world viewport interfaces.
244+ For example, it provides options to control which channels (red, green,
245+ blue, alpha, luma) are visible, as well as optional diagnostic
246+ adjustments (such as an exposure offset in scene linear).
247+
248+ #. **Get the Config.** In this example, use one of the built-in configs.
249+
250+ #. **Get the default display for this config and the display's default view.**
251+
252+ #. **Create a new DisplayViewTransform.** This transform has the basic
253+ conversion from the reference space to the display, but without extras
254+ such as channel swizzling and exposure control.
255+
256+ #. **Set up any diagnostic or creative look adjustments.** If the user wants
257+ to specify a channel swizzle, a scene-linear exposure offset, or an
258+ artistic look, this is the place to add it. See ociodisplay for an
259+ example. Note that although we provide recommendations for display,
260+ any transforms may be added into any of the slots. So if your app
261+ needs to chain three transforms in a particular slot, you are free
262+ to wrap them in a GroupTransform and set it accordingly (see the
263+ sketch after this list).
264+
265+ #. **Create a new LegacyViewingPipeline.** This object embodies the
266+ full viewing pipeline you wish to control and inserts each of the
267+ specified adjustments at the appropriate point in the pipeline,
268+ performing any necessary color space conversions. For example, the
269+ linearCC is applied in the config's scene_linear role and the
270+ colorTimingCC is applied in the color space of the color_timing role.
271+
272+ #. **Get the Processor from the LegacyViewingPipeline.** A CPUProcessor is
273+ then created from it to process pixels on the CPU.
274+
275+ #. **Convert your image, using the CPUProcessor.**
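+
+ As noted in step 4, several transforms can be chained in a single slot by
+ wrapping them in a GroupTransform. A minimal sketch of that idea (the
+ particular transforms combined here are arbitrary illustrations, not a
+ recommendation):
+
+ .. code-block:: python
+
+     import PyOpenColorIO as ocio
+
+     # Two example adjustments intended for the single linearCC slot.
+     exposure_tr = ocio.ExposureContrastTransform(exposure=1.0)
+     desat_tr = ocio.CDLTransform()
+     desat_tr.setSat(0.9)
+
+     # Chain them by wrapping both in a GroupTransform.
+     group_tr = ocio.GroupTransform()
+     group_tr.appendTransform(exposure_tr)
+     group_tr.appendTransform(desat_tr)
+
+     pipeline = ocio.LegacyViewingPipeline()
+     pipeline.setLinearCC(group_tr)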
241276
242- This alternative version allows for a more complex viewing pipeline, allowing
243- for all of the controls typically added to real-world viewer interfaces. For
244- example, options are allowed to control which channels (red, green, blue,
245- alpha, luma) are visible, as well as allowing for optional color corrections
246- (such as an exposure offset in scene linear).
277+ Python
278+ ++++++
247279
248- #. **Get the Config.**
249- See :ref:`usage_applybasic` for details.
250- #. **Lookup the display ColorSpace.**
251- See :ref:`usage_displayimage` for details
252- #. **Create a new DisplayViewTransform.**
253- This transform has the basic conversion from the reference space to the
254- display but without the extras such as the channel swizzling and exposure
255- control.
256- The user is required to call
257- :cpp:func:`DisplayViewTransform::setSrc` to set the input
258- ColorSpace, as well as
259- :cpp:func:`DisplayViewTransform::setDisplay` and
260- :cpp:func:`DisplayViewTransform::setView`
261- #. **Create a new LegacyViewingPipeline.**
262- This transform will embody the full 'display' pipeline you wish to control.
263- The user is required to call
264- :cpp:func:`LegacyViewingPipeline::setDisplayViewTransform` to set the
265- DisplayViewTransform.
266- #. **Set any additional LegacyViewingPipeline options.**
267- If the user wants to specify a channel swizzle, a scene-linear exposure
268- offset, an artistic look, this is the place to add it. See ociodisplay for an
269- example. Note that although we provide recommendations for display, any
270- transforms are allowed to be added into any of the slots. So if for your app
271- you want to add 3 transforms into a particular slot (chained together), you
272- are free to wrap them in a :cpp:class:`GroupTransform` and set it
273- accordingly!
274- #. **Get the processor from the LegacyViewingPipeline.**
275- The processor is then queried from the LegacyViewingPipeline.
276- #. **Convert your image, using the processor.**
277- See :ref:`usage_applybasic` for details for using the CPU.
280+ .. code-block:: python
281+
282+ import PyOpenColorIO as ocio
283+
284+ # Set up some example input variables to simulate the diagnostic
285+ # adjustments a user might make with viewport controls to analyze
286+ # different parts of the image tone scale.
287+ exposure_val = 1.2 # +1.2 stops exposure adjustment
288+ gamma_val = 0.8 # adjust diagnostic gamma to 0.8
289+
290+ # Step 1: Use one of the built-in configs.
291+ config = ocio.Config.CreateFromBuiltinConfig("studio-config-latest")
292+
293+ # Step 2: Get the default display and view.
294+ display = config.getDefaultDisplay()
295+ view = config.getDefaultView(display)
296+
297+ # Step 3: Create a DisplayViewTransform to convert from the scene-linear
298+ # role to the selected display & view.
299+ display_view_tr = ocio.DisplayViewTransform(
300+     src=ocio.ROLE_SCENE_LINEAR,
301+     display=display,
302+     view=view
303+ )
304+
305+ # Step 4: Set up any diagnostic or creative look adjustments.
306+
307+ # Create an ExposureContrastTransform to apply an exposure adjustment in
308+ # the scene-linear input space. Making the exposure dynamic allows it to be
309+ # adjusted interactively without rebuilding the processor (sketch below).
310+ exposure_tr = ocio.ExposureContrastTransform(
311+     exposure=exposure_val,
312+     dynamicExposure=True
313+ )
314+
315+ # Add a channel view 'swizzle'.
316+ channelHot = (1, 1, 1, 1)    # show RGB
317+ # channelHot = (1, 0, 0, 0)  # show red
318+ # channelHot = (0, 0, 0, 1)  # show alpha
319+ # channelHot = (1, 1, 1, 0)  # show luma
320+ channel_view_tr = ocio.MatrixTransform.View(
321+     channelHot=channelHot,
322+     lumaCoef=config.getDefaultLumaCoefs()
323+ )
324+
325+ # Add a second ExposureContrastTransform, this one applying a gamma
326+ # adjustment in the output display space (useful for checking shadow
327+ # detail).
328+ gamma_tr = ocio.ExposureContrastTransform(
329+     gamma=gamma_val,
330+     pivot=1.0,
331+     dynamicGamma=True
332+ )
333+
334+ # Step 5: Create a LegacyViewingPipeline which builds a processing pipeline
335+ # by adding the various diagnostic controls around the DisplayViewTransform.
336+ viewing_pipeline = ocio.LegacyViewingPipeline()
337+ viewing_pipeline.setLinearCC(exposure_tr)
338+ viewing_pipeline.setChannelView(channel_view_tr)
339+ viewing_pipeline.setDisplayViewTransform(display_view_tr)
340+ viewing_pipeline.setDisplayCC(gamma_tr)
341+
342+ # Step 6: Create a Processor and CPUProcessor from the pipeline.
343+ proc = viewing_pipeline.getProcessor(config)
344+ # Use the default optimization level to create a CPU Processor.
345+ cpu_proc = proc.getDefaultCPUProcessor()
346+
347+ # Step 7: Evaluate an image pixel value.
348+ image_pixel = [0.5, 0.4, 0.3]  # a test value to process
349+ rgb = cpu_proc.applyRGB(image_pixel)
350+ print(rgb)
351+
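+ Because the exposure and gamma transforms were flagged as dynamic, they may
+ be adjusted after the processor has been built. A minimal sketch of that,
+ assuming the dynamic property accessors exposed by the Python bindings and
+ continuing from the variables above (the value used is arbitrary):
+
+ .. code-block:: python
+
+     # Fetch the dynamic exposure property from the CPUProcessor and update
+     # its value; the processor does not need to be rebuilt.
+     dyn_exposure = cpu_proc.getDynamicProperty(ocio.DYNAMIC_PROPERTY_EXPOSURE)
+     dyn_exposure.setDouble(2.0)  # e.g. the user drags exposure to +2 stops
+
+     # Re-evaluate the same pixel with the new exposure setting.
+     print(cpu_proc.applyRGB([0.5, 0.4, 0.3]))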
352+
353+ Displaying an image, using the GPU
354+ **********************************
278355
279356 Applying OpenColorIO's color processing using the GPU is very customizable
280357 and an example helper class is provided for use with OpenGL.
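+
+ For orientation before the OpenGL helper is introduced, a minimal sketch of
+ the CPU-side portion of that workflow, generating the shader text for a GPU
+ processor (the built-in config, view selection, and GLSL version used here
+ are arbitrary choices for illustration):
+
+ .. code-block:: python
+
+     import PyOpenColorIO as ocio
+
+     config = ocio.Config.CreateFromBuiltinConfig("studio-config-latest")
+     display = config.getDefaultDisplay()
+     view = config.getDefaultView(display)
+
+     proc = config.getProcessor(ocio.ROLE_SCENE_LINEAR, display, view,
+                                ocio.TRANSFORM_DIR_FORWARD)
+     gpu_proc = proc.getDefaultGPUProcessor()
+
+     # Generate shader source for the chosen GPU language; the application
+     # is responsible for compiling it and binding any LUT textures.
+     shader_desc = ocio.GpuShaderDesc.CreateShaderDesc(
+         language=ocio.GPU_LANGUAGE_GLSL_4_0
+     )
+     gpu_proc.extractGpuShaderInfo(shader_desc)
+     print(shader_desc.getShaderText())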