Syllabus
The Digital Photography Decal is a technology-based photography course. Rather than dealing with the aesthetic
aspects of photographs, this class will look at the technology
underlying photography, discussing how camera systems work to produce images. We’ll also explore the frontiers of
photography: working through difficult photographic situations, learning our
cameras’ technical limitations, and finding out what we can do as photographers to work
around or compensate for them.
The class will concentrate heavily on the workings of the digital imaging
sensor, and will also cover optical systems and software-based image editing. The focus will be on
digital camera systems – however, many topics covered in the course are
common to both digital and film photography.
The class is aimed at any students interested in photography. Some experience with basic photography
principles will be very helpful (students should know about aperture, shutter
speed, and what an “exposure stop” is), although accommodations can be made for
interested students who don’t have any experience.
Core Curriculum
The course will consist of three core units, which will collectively
cover the complete function of a camera, from light to image, and will explain, and
find technical solutions to, some of the most difficult problems and situations
encountered by photographers today.
Students will be expected to learn about the following subjects, including a
comprehensive background of how the underlying technology works, and techniques
to employ when facing different difficult photographic situations.
The class will meet weekly to discuss a photographic technology-related topic,
which will include presentations, lectures, class discussion, and hands-on
demonstrations and exercises. In
addition, there will be weekly field assignments where students will take
photos, concentrating on or experimenting with a specific photographic topic. Part of the subsequent class will be devoted
to reviewing and analyzing students’ pictures.
Anatomy of a digital imaging sensor
Low light photography
-Camera shake (causes, and how to stabilize)
--Faster shutter speed
---Underexposure
---Higher ISO
----Noise issues
----Lower-noise sensor design
----Noise reduction techniques
---Larger aperture
--Image stabilization (CCD, lens, software)
--Traditional photography techniques
-Motion blur – causes, and how to correct
--Camera shake methods (which ones apply, which don’t)
--Flash (stop-action)
Exposure/Dynamic range
-Dynamic and tonal range limitations
--Ways to increase dynamic/tonal range
---Pulling shadows
---Preserving highlights
---HDR
---Curves
---Bit depth
Focus/Sharpness
-Focusing system (pinhole camera, lenses)
--Pinhole camera
--Lenses
---Aperture
---Focal length
-Depth of field
-Autofocus mechanisms
-Optical aberration, lens design
-Sharpening (stopping down, software)
-Demosaicing/Interpolation
--Color filtration
---Bayer filter
---Alternative methods
Course Schedule
Unit 1: Low Light Photography
Week 1:
Start with the anatomy of the sensor.
Conversion of light (photons) into photoelectrons which are processed into
pixel info
Subdivision of the sensor into pixels
Photowell size/depth
Exposure: how aperture, shutter speed, and ISO play into this.
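The interplay of these three parameters can be sketched numerically. A minimal Python sketch, using the standard exposure-value formula (the specific camera settings below are just illustrative):

```python
import math

def exposure_value(f_number, shutter_s, iso=100):
    """EV relative to ISO 100: EV = log2(N^2 / t) - log2(ISO / 100).
    A change of 1 EV is one "stop": a doubling or halving of light."""
    return math.log2(f_number ** 2 / shutter_s) - math.log2(iso / 100)

# Opening the aperture by one stop while halving the exposure time
# admits the same total light (using the exact f-number 2*sqrt(2)):
ev_wide = exposure_value(2 * math.sqrt(2), 1 / 60)
ev_narrow = exposure_value(4.0, 1 / 30)
print(round(ev_wide - ev_narrow, 6))  # -> 0.0
```

Doubling the ISO likewise shifts the result by exactly one stop, which is why the three parameters are interchangeable in exposure terms.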
Assignment: Next week we’ll discuss low light, so take lots of low-light
pictures, and make note of the different effects you notice in your images. Contrast these with how you would have expected
a normal image to look if it were taken outside in the middle of the day
with plenty of light, and also with how you want your image to look. While you’re thinking about what’s going
on in your images, think back to the model of the sensor we discussed, and consider what
might be going on inside the camera to produce these effects.
Week 2:
Effects and differences of low light
[Look at the students’ images, ask what effects they noticed]
Camera shake: What causes it (in class demonstration, live feed from a camera)?
How do we correct it?
Traditional methods: Stabilizing the camera
Hold up to face (viewfinder vs. rear-display LCD)
Prop elbows
Tripod/Monopod
Multiple shots, keeping the steadiest frame
Assignment: Try out these traditional methods: low-light shooting again!
Week 3:
[Look at last week’s photos. What
methods worked, and what didn’t?]
Other image stabilization methods – lens-shift, sensor-shift, software
recombination
Wider focal lengths + cropping?
Down-rezzing the image
Flash?
Assignment: Attempt cropping vs. down-resolution. Also, attempt a recombination of a burst of
underexposed shots (software recombination stabilization). If students have cameras with lens or sensor
stabilization, ask them to run tests to determine how effective these methods are.
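The idea behind software recombination can be sketched with a toy simulation: averaging several aligned, noisy frames reduces random noise by roughly the square root of the number of frames. (The code below is a simplified illustration, not an actual alignment algorithm; real implementations must first register the frames against each other.)

```python
import random
import statistics

random.seed(0)

def noisy_frame(n_pixels, signal=40.0, sigma=10.0):
    """Simulate one underexposed frame: true signal plus random sensor noise."""
    return [signal + random.gauss(0.0, sigma) for _ in range(n_pixels)]

def stack(frames):
    """Recombine already-aligned frames by averaging each pixel across the burst."""
    return [statistics.mean(pixel) for pixel in zip(*frames)]

burst = [noisy_frame(1000) for _ in range(8)]
noise_single = statistics.stdev(burst[0])       # noise in one frame
noise_stacked = statistics.stdev(stack(burst))  # noise after recombining 8 frames
print(noise_single / noise_stacked)             # roughly sqrt(8), about 2.8
```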
Week 4:
[Look at last week’s photos. Which
of our technological methods worked?
Which didn’t? Compared to actual
photographic techniques, how effective is image stabilization technology?]
We’ve explored one method of combating slow shutter speed problems –
stabilizing the camera, or a camera element, to give us a
consistent light feed. The other option
is simply to increase the shutter speed, minimizing the results of any
shake. We do this by adjusting the other
two exposure parameters: ISO and aperture.
Larger aperture
Higher sensitivity
Underexposure + Exposure Compensation
Assignment: Experiment with these three methods
Week 5:
[How effective was increasing the shutter speed in eliminating our
slow-shutter problems? How did it
compare to the photographic techniques discussed in Week 2, or the IS
technology in Week 3? Were there any
other side effects to our three methods?]
As you may have noticed, opening to a larger aperture narrows our depth of
field; increasing sensitivity also tends to increase noise (note the differences
between a natively higher ISO and underexposure + EC); and exposure
compensation tends to increase noise and clip shadows. We’ll discuss depth of field later on with
lenses, and clipping later on in exposure, but for now we will concentrate on
the noise aspect of sensitivity and exposure compensation.
What produces noise? Return to model of
the image sensor.
-Random arrival of photons (Poisson distribution)
-Dark current
How can we combat noise?
-Sensor designs to combat this (MP/pixel pitch, sensor size)
-Correcting noise in post-processing
--Dark frame subtraction
--Neat Image or similar noise-reduction software
--Darkening image
--Blur
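Of these, dark frame subtraction is simple enough to sketch directly: a second exposure taken with the lens cap on records only the fixed-pattern noise (dark current, hot pixels), which can then be subtracted pixel by pixel. A toy example with made-up values:

```python
# Toy 1x6 "images"; the third photosite is a hot pixel that shows up in
# both the light frame and the lens-cap dark frame.
light_frame = [52, 48, 120, 50, 49, 47]
dark_frame = [2, 1, 70, 1, 2, 1]

# Subtract the dark frame, clamping at zero so no pixel goes negative.
corrected = [max(light - dark, 0) for light, dark in zip(light_frame, dark_frame)]
print(corrected)  # -> [50, 47, 50, 49, 47, 46]
```

Note that this removes only the repeatable, fixed-pattern component of the noise; the random (Poisson) component differs from shot to shot and cannot be subtracted away.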
Assignment: Two assignments are planned for this week (possibly split over two
weeks?):
Students with different types of camera (different sensor size, different
number of pixels) should pair together, and take images at different ISO
settings (and if possible, different noise reduction settings). How do the characteristics of the noise
differ between cameras with large/small sensors, high/low megapixels? Is there a noticeable difference between
specific camera manufacturers?
After assignment 1, students can either use their ISO test pictures, images
from Week 4, or shoot more photos on their own.
Students will then work to clean up the noisy images. Students should
describe their methods, step by step, in enough detail so that someone else
could reproduce the same results.
Week 6:
[Review the results of Week 5 Assignment 1.
Did the noise trends in sensor size and the number of pixels correlate
with our model of the sensor? Were there
any trends among different camera manufacturers (possibly indicating a
poor-performing sensor, or a different approach to in-camera noise
processing)?
Demonstrate before/after images for noise processing. A few students will present their techniques]
Another common type of blurring isn’t due to the camera, but due to subject
motion. Will our methods for shooting static
low-light scenes work the same for action in low light?
Assignment: Go out there and shoot action (possibly assign each photographer to
shoot a sports game?). Employ our
various photographic or technological IS methods, as well as increasing shutter
speed, and possibly even flash.
Week 7:
[Which of our methods worked, and which didn’t?]
Fun class: Field trip – a night photoshoot around campus,
employing the various skills we’ve learned.
Assignment: “Midterm Project” for Unit 1.
Choose a very challenging low-light subject (possibly from the night
photoshoot), and prepare your best low-light shot to present next class. Next class, present the photo, detailing both
the techniques used to take it and the techniques used in post-processing.
Unit 2: Exposure/Dynamic Range
Week 8:
[Students’ presentations from Week 7 “Midterm Project”]
What is dynamic range?
What is tonal range?
Why is it desirable to have large ranges of both?
Discuss exposure, and the limitations of dynamic range and tonal range, in
terms of the sensor
The photo well depth, the analog-to-digital conversion bit depth
Innovative Approach: Fujifilm’s SR sensor
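The link between photo well depth, noise floor, and dynamic range can be sketched with illustrative (entirely hypothetical) sensor numbers: dynamic range is roughly the ratio of full-well capacity to the noise floor, expressed in stops, while the ADC bit depth sets how finely that range is quantized (tonal range).

```python
import math

# Hypothetical sensor numbers, for illustration only.
full_well_electrons = 40000  # charge a photo well holds before clipping
read_noise_electrons = 10    # noise floor with no light at all

# Dynamic range in stops: log2 of the usable signal ratio.
dynamic_range_stops = math.log2(full_well_electrons / read_noise_electrons)
print(round(dynamic_range_stops, 1))  # -> 12.0

# Tonal range: the ADC bit depth sets how many discrete levels span that range.
for bits in (8, 12, 14):
    print(f"{bits}-bit ADC -> {2 ** bits} levels")
```

This also hints at why an 8-bit JPG (256 levels) discards so much of what a 12- or 14-bit RAW capture records.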
Assignment: Shoot in high contrast environments, and shoot them at different
ISOs, as well as different levels of overexposure and underexposure, to see
what the limitations are in terms of dynamic and tonal range. Also, try playing around with in-camera image
parameters, such as “contrast” or “saturation”, or even the various “picture
modes” some cameras include. For
students with cameras capable of RAW, compare 16- or 12-bit RAW images against
regular 8-bit JPG.
Week 9:
[Review photos, noting
various dynamic and tonal range limits, and comparisons between RAW/JPG, camera
parameters, and types of cameras. How does a professional-level SLR compare to a
consumer-level ‘point and shoot’? Is this some factor of the sensor design, or
merely different image-processing philosophies?]
Although we might have thought the high contrast photos were hopelessly lost,
through software such as Shadows/Highlights we discover that we actually have a
lot more light information than we thought, especially in the shadow areas, and
that we can retrieve a marginal amount of detail in the highlights, as well,
especially if we start from our RAW light information. Demonstrate levels, shadows and highlights,
HDR bracketing, and curves.
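Why shadow recovery works so well can be seen in the shape of a typical curve: a gamma-style adjustment multiplies dark values far more than bright ones. A minimal sketch on normalized (0–1) pixel values, with arbitrary sample tones:

```python
def lift_shadows(value, gamma=0.5):
    """Apply output = input ** gamma; gamma < 1 brightens dark tones the most."""
    return value ** gamma

for v in (0.04, 0.25, 0.81):
    print(v, "->", round(lift_shadows(v), 2))
# A deep shadow (0.04) is lifted 5x to 0.2, while a highlight (0.81)
# barely moves (to 0.9) - the curve opens shadows while preserving highlights.
```

Real curves tools let you shape this function arbitrarily, but the principle is the same: a steeper slope in the shadow region redistributes more output levels to dark input tones.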
Assignment: Maybe take more photos, but work with an image and see just how
much dynamic and tonal range you can get.
How far can you raise the shadows, or burn in highlights without looking
unnatural? How should you expose, if you
plan to restore as much dynamic range as possible later on in
post-processing? Students will prepare a
before/after image which maximizes the visible dynamic and tonal range.
Unit 3: Focus/Sharpness
Week 10:
[Present students’ images from Week 9]
We’ve talked a lot about various aspects of exposure, and how the sensor
converts light information into pixels, but we haven’t really talked about how
the camera forms and concentrates that light onto the sensor. We showed a rudimentary example in the
beginning with the original pinhole design, but since then we’ve developed
lenses that give us better control of the light that comes in. Diagrams of how lenses can change light
trajectories, and an explanation of exactly what “in/out of focus” means (circle of confusion).
Possibly an in-class lab with lenses?
(Like a Physics 7C refraction lab?)
Assignment: To be determined
Week 11: A closer examination of
lens systems, and possibly an even closer look at lens designs? In any case, an explanation of how focal
lengths define the field of view, and how focal lengths, apertures, and subject
distances define the depth of field.
Make sure to note the crop factor of certain cameras, for students to
keep in mind during assignment.
Assignment: Controlled test: Shoot at different apertures, different focal
lengths, different distance to subjects.
How does varying each parameter affect the image?
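These relationships can be made concrete with the standard thin-lens depth-of-field approximations (the circle-of-confusion value below is a common full-frame figure; all of the numbers are illustrative):

```python
def depth_of_field(focal_mm, f_number, subject_mm, coc_mm=0.03):
    """Standard hyperfocal-distance approximation: H = f^2 / (N * c) + f;
    the near and far limits of acceptable focus follow from H and the
    subject distance."""
    h = focal_mm ** 2 / (f_number * coc_mm) + focal_mm
    near = subject_mm * (h - focal_mm) / (h + subject_mm - 2 * focal_mm)
    far = (subject_mm * (h - focal_mm) / (h - subject_mm)
           if subject_mm < h else float("inf"))
    return near, far

# Stopping down from f/2.8 to f/8 (50 mm lens, subject at 3 m) widens the
# zone of acceptable sharpness around the subject:
for n in (2.8, 8.0):
    near, far = depth_of_field(50, n, 3000)
    print(f"f/{n}: in focus from {near / 1000:.2f} m to {far / 1000:.2f} m")
```

Varying the focal length or the subject distance in the same sketch shows the other two dependencies the assignment explores.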
Week 12:
[Discuss results from last
assignment, why things came out as they did, in terms of the different
parameters]
Open Discussion, with a little bit of aesthetic discussion:
Maintaining same subject magnification, differences between changing subject
distance and changing focal length
Distortion as a result of subject distance
Differences in depth of field behind and in front of focal distance, what
causes this?
Effect of sensor size on depth of field
Ways to manipulate depth of field:
-lenses that shift the proportional depth of field in front of/behind focal
distance
-selective blurring to mimic shallower depth of field
-sharpening (Unsharp mask), in brief detail (indiscriminate sharpening of
entire image)
-image stacking through multiple images to expand depth of field
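Unsharp masking, mentioned above, is simple enough to sketch in one dimension: blur the signal, then add back the difference between the original and the blur (the high-frequency detail), which exaggerates edges. The sample values are arbitrary:

```python
def unsharp(signal, amount=1.0):
    """Unsharp mask on a 1-D signal: output = input + amount * (input - blur)."""
    # Simple 3-tap mean blur, with edge pixels clamped to the signal ends.
    blurred = [
        (signal[max(i - 1, 0)] + signal[i] + signal[min(i + 1, len(signal) - 1)]) / 3
        for i in range(len(signal))
    ]
    return [s + amount * (s - b) for s, b in zip(signal, blurred)]

soft_edge = [10, 10, 10, 90, 90, 90]
print(unsharp(soft_edge))
# The values right at the edge overshoot (below 10 on the dark side, above 90
# on the bright side) - the familiar "halo" of over-sharpened images.
```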
Assignment: Attempt to manipulate depth of field, through selective blurring,
sharpening and image stacking. Detail
techniques step-by-step, and explain in next class.
Week 13:
[Students present their
depth-of-field manipulated images, and detail their techniques]
We know how we define focus, and by looking at the images ourselves, we
know what is “in focus” and what is “out of focus”. But for the camera, all it sees is
indiscriminate light splashed onto the surface of the sensor - by itself, the
camera has no way of knowing if something is in focus or not. How then, do autofocus systems work?
Two types:
Passive phase/contrast detection
Active light beam
-Limitations of each method
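Contrast detection, in particular, can be sketched simply: the camera sweeps the focus position and picks whichever position maximizes image contrast, since a sharp image has harder edges than a defocused one. A toy version scoring a one-dimensional scanline by variance (all values hypothetical):

```python
def contrast(scanline):
    """Score sharpness as the variance of pixel values: crisp edges score high."""
    mean = sum(scanline) / len(scanline)
    return sum((v - mean) ** 2 for v in scanline) / len(scanline)

# Hypothetical scanlines captured at three focus positions during a sweep.
sweep = {
    "near": [40, 45, 50, 55, 60, 65],   # defocused: a soft gradient
    "focus": [10, 10, 10, 90, 90, 90],  # in focus: a crisp edge
    "far": [35, 40, 50, 60, 65, 70],    # defocused again
}
best = max(sweep, key=lambda position: contrast(sweep[position]))
print(best)  # -> focus
```

This also illustrates one of contrast detection’s limitations: a low-contrast or low-light subject gives the sweep little to maximize, which is when these systems hunt.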
Assignment: To be determined
Week 14:
Imperfect lenses
We have so far covered the basic properties and effects of lenses, and from
what we’ve learned so far, they seem pretty simple. We have lenses of certain focal lengths and
which represent certain apertures, and we might have a moving element that
keeps everything in focus. Why then, are
real-life lenses (at least the expensive, high-end ones) such huge and complex
beasts?
The answer is that lenses aren’t perfect, and have varying levels of distortion
and dispersion. Different types of
aberration
-Monochromatic aberration (distortion and defocus)
-Chromatic aberrations (color dispersion)
What problems they cause in real-world images
-Field curvature (a curved, non-flat plane of focus)
-Distortion (barrel and pincushion)
-Diffraction
-Color fringing (misalignment of colors)
Ways to fix:
-Lens designs: apochromatic lenses, achromatic lenses, diffractive optics (DO)
-Software correction for distortion in images
-Software Removal of fringing
Assignment: Attempt to expose imperfections in lenses – in what kinds of
photographic situations (and with what kinds of lens settings/parameters) are they
most noticeable? With what settings, or in what situations, can
these imperfections be minimized? Also,
use software methods to correct images with distortion and images with color
fringing, again detailing the technique step-by-step.
Week 15:
[Review and discuss images from previous week, noting what kind of situations
and settings seem to exacerbate or minimize lens imperfections. Also present software corrections of
aberration in images.]
This entire time, we have left out one extremely important part of how the
sensor works. In our original model, we
showed that the sensor would take the energy from photons and convert them into
an electrical signal, which would denote the pixel’s brightness. However, we never explained how the sensor
determines what hue that pixel should be. The truth is, our conventional sensor is
unable to discern that!
Sensors only interpret luminance values (they only render black and white)
In most digital imaging devices, color information is interpreted through color
filtration
Most common method:
Bayer filter
-Multiple drawbacks: loss of light information, need to interpolate/demosaic
Alternative methods, benefits, various drawbacks:
-Variations on Bayer Filter
-Multi-sensor (two-sensor for luma, chroma, or three-sensor for RGB)
-Multi-shot
-Linear scan
-Foveon
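The interpolation step can be sketched in miniature: each photosite records only one channel, and the two missing channels at each pixel are estimated from neighbors. Below, a toy RGGB mosaic and a bilinear estimate of green at a non-green site (the pixel values are arbitrary):

```python
# Toy 4x4 Bayer mosaic, RGGB tiling: row 0 is R G R G, row 1 is G B G B, etc.
mosaic = [
    [10, 200, 12, 210],
    [190, 55, 185, 60],
    [11, 205, 13, 195],
    [180, 50, 188, 52],
]

def green_at(y, x):
    """Estimate green at a red/blue photosite: mean of in-bounds 4-neighbor values.
    (In an RGGB tiling, the 4-neighbors of every red or blue site are all green.)"""
    neighbors = [(y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)]
    values = [mosaic[j][i] for j, i in neighbors
              if 0 <= j < len(mosaic) and 0 <= i < len(mosaic[0])]
    return sum(values) / len(values)

print(green_at(1, 1))  # blue site: mean of greens 200, 205, 190, 185 -> 195.0
```

Real demosaicing algorithms are far more sophisticated (edge-aware, channel-correlated), but all of them are estimating data the sensor never actually captured, which is the source of the Bayer filter’s light loss and interpolation artifacts.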
Necessary Materials
The only equipment required for this class is a digital camera of some
sort. Even a cell phone or PDA camera
will work for most assignments, so long as the images can be transferred onto
the computer. However, the more capable
the camera, the more you will be able to learn. A tripod may also be useful.
You may use a film camera, although you would need to scan the negatives in as
digital images for the class (a film scanner is available). Given that the assignments are due weekly,
this is probably impractical.
Adobe Photoshop will be used extensively.
However, other similar image editing applications will work fine, and
free image editing software resources are available.
Minimum Camera Feature Requirements:
Ability to retrieve image files onto computer
2MP images (less is possible, but makes software exercises difficult)
Useful Camera Features for this class:
Manual control over ISO, Aperture, and Shutter Speed
Large range of apertures (wider than f/2.8, smaller than f/11) and ISO settings (below ISO 200, above ISO 800)
Large range in focal lengths (large optical “zoom”)
Image stabilization technology
Ability to capture images in RAW format
Grading
Students will be evaluated on class participation, which will include
discussing photographic topics, taking photographs for weekly assignments, and
analyzing peers’ images. Students will
be permitted to miss no more than 3 classes, and will still be required to
complete all of the weekly photographic assignments. The course is graded on a Pass/No Pass basis.