Intelligent Image Processing

Table of Contents:

    Preface.

    1 Humanistic Intelligence as a Basis for Intelligent Image Processing.

    1.1 Humanistic Intelligence.

    1.1.1 Why Humanistic Intelligence.

    1.1.2 Humanistic Intelligence Does Not Necessarily Mean "User-Friendly".

    1.2 "WearComp" as Means of Realizing Humanistic Intelligence.

    1.2.1 Basic Principles of WearComp.

    1.2.2 The Six Basic Signal Flow Paths of WearComp.

    1.2.3 Affordances and Capabilities of a WearComp-Based Personal Imaging System.

    1.3 Practical Embodiments of Humanistic Intelligence.

    1.3.1 Building Signal-Processing Devices Directly Into Fabric.

    1.3.2 Multidimensional Signal Input for Humanistic Intelligence.

    2 Where on the Body is the Best Place for a Personal Imaging System?

    2.1 Portable Imaging Systems.

    2.2 Personal Handheld Systems.

    2.3 Concomitant Cover Activities and the Videoclips Camera System.

    2.3.1 Rationale for Incidentalist Imaging Systems with Concomitant Cover Activity.

    2.3.2 Incidentalist Imaging Systems with Concomitant Cover Activity.

    2.3.3 Applications of Concomitant Cover Activity and Incidentalist Imaging.

    2.4 The Wristwatch Videophone: A Fully Functional "Always Ready" Prototype.

    2.5 Telepointer: Wearable Hands-Free Completely Self-Contained Visual Augmented Reality.

    2.5.1 No Need for Headwear or Eyewear If Only Augmenting.

    2.5.2 Computer-Supported Collaborative Living (CSCL).

    2.6 Portable Personal Pulse Doppler Radar Vision System Based on Time-Frequency Analysis and q-Chirplet Transform.

    2.6.1 Radar Vision: Background, Previous Work.

    2.6.2 Apparatus, Method, and Experiments.

    2.7 When Both Camera and Display are Headworn: Personal Imaging and Mediated Reality.

    2.7.1 Some Simple Illustrative Examples.

    2.7.2 Mediated Reality.

    2.7.3 Historical Background Leading to the Invention of the Reality Mediator.

    2.7.4 Practical Use of Mediated Reality.

    2.7.5 Personal Imaging as a Tool for Photojournalists and Reporters.

    2.7.6 Practical Implementations of the RM.

    2.7.7 Mediated Presence.

    2.7.8 Video Mediation.

    2.7.9 The Reconfigured Eyes.

    2.8 Partially Mediated Reality.

    2.8.1 Monocular Mediation.

    2.9 Seeing "Eye-to-Eye".

    2.10 Exercises, Problem Sets, and Homework.

    2.10.1 Viewfinders.

    2.10.2 Viewfinders Inside Sunglasses.

    2.10.3 Mediated Reality.

    2.10.4 Visual Vicarious Documentary.

    2.10.5 Aremac Field of View.

    2.10.6 Matching Camera and Aremac.

    2.10.7 Finding the Right Camera.

    2.10.8 Testing the Camera.

    3 The EyeTap Principle: Effectively Locating the Camera Inside the Eye as an Alternative to Wearable Camera Systems.

    3.1 A Personal Imaging System for Lifelong Video Capture.

    3.2 The EyeTap Principle.

    3.2.1 "Lightspace Glasses".

    3.3 Practical Embodiments of EyeTap.

    3.3.1 Practical Embodiments of the Invention.

    3.3.2 Importance of the Collinearity Criterion.

    3.3.3 Exact Identity Mapping: The Orthoscopic Reality Mediator.

    3.3.4 Exact Identity Mapping Over a Variety of Depth Planes.

    3.4 Problems with Previously Known Camera Viewfinders.

    3.5 The Aremac.

    3.5.1 The Focus-Tracking Aremac.

    3.5.2 The Aperture Stop Aremac.

    3.5.3 The Pinhole Aremac.

    3.5.4 The Diverter Constancy Phenomenon.

    3.6 The Foveated Personal Imaging System.

    3.7 Teaching the EyeTap Principle.

    3.7.1 Calculating the Size and Shape of the Diverter.

    3.8 Calibration of EyeTap Systems.

    3.9 Using the Device as a Reality Mediator.

    3.10 User Studies.

    3.11 Summary and Conclusions.

    3.12 Exercises, Problem Sets, and Homework.

    3.12.1 Diverter Embodiment of EyeTap.

    3.12.2 Calculating the Size of the Diverter.

    3.12.3 Diverter Size.

    3.12.4 Shape of Diverter.

    3.12.5 Compensating for Slight Aremac Camera Mismatch.

    4 Comparametric Equations, Quantigraphic Image Processing, and Comparagraphic Rendering.

    4.1 Historical Background.

    4.2 The Wyckoff Principle and the Range of Light.

    4.2.1 What's Good for the Domain Is Good for the Range.

    4.2.2 Extending Dynamic Range and Improvement of Range Resolution by Combining Differently Exposed Pictures of the Same Subject Matter.

    4.2.3 The Photoquantigraphic Quantity, q.

    4.2.4 The Camera as an Array of Light Meters.

    4.2.5 The Accidentally Discovered Compander.

    4.2.6 Why Stockham Was Wrong.

    4.2.7 On the Value of Doing the Exact Opposite of What Stockham Advocated.

    4.2.8 Using Differently Exposed Pictures of the Same Subject Matter to Get a Better Estimate of q.

    4.2.9 Exposure Interpolation and Extrapolation.

    4.3 Comparametric Image Processing: Comparing Differently Exposed Images of the Same Subject Matter.

    4.3.1 Misconceptions about Gamma Correction: Why Gamma Correction Is the Wrong Thing to Do!

    4.3.2 Comparametric Plots and Comparametric Equations.

    4.3.3 Zeta Correction of Images.

    4.3.4 Quadratic Approximation to Response Function.

    4.3.5 Practical Example: Verifying Comparametric Analysis.

    4.3.6 Inverse Quadratic Approximation to Response Function and its Squadratic Comparametric Equation.

    4.3.7 Sqrtic Fit to the Function f(q).

    4.3.8 Example Showing How to Solve a Comparametric Equation: The Affine Comparametric Equation and Affine Correction of Images.

    4.3.9 Power of Root over Root Plus Constant Correction of Images.

    4.3.10 Saturated Power of Root over Root Plus Constant Correction of Images.

    4.3.11 Some Solutions to Some Comparametric Equations That Are Particularly Illustrative or Useful.

    4.3.12 Properties of Comparametric Equations.

    4.4 The Comparagram: Practical Implementations of Comparanalysis.

    4.4.1 Comparing Two Images That Differ Only in Exposure.

    4.4.2 The Comparagram.

    4.4.3 Understanding the Comparagram.

    4.4.4 Recovering the Response Function from the Comparagram.

    4.4.5 Comparametric Regression and the Comparagram.

    4.4.6 Comparametric Regression to a Straight Line.

    4.4.7 Comparametric Regression to the Exponent over Inverse Exponent of Exponent Plus Constant Model.

    4.5 Spatiotonal Photoquantigraphic Filters.

    4.5.1 Spatiotonal Processing of Photoquantities.

    4.6 Glossary of Functions.

    4.7 Exercises, Problem Sets, and Homework.

    4.7.1 Parametric Plots.

    4.7.2 Comparaplots and Processing "Virtual Light".

    4.7.3 A Simple Exercise in Comparametric Plots.

    4.7.4 A Simple Example with Actual Pictures.

    4.7.5 Unconstrained Comparafit.

    4.7.6 Weakly Constrained Comparafit.

    4.7.7 Properly Constrained Comparafit.

    4.7.8 Combining Differently Exposed Images.

    4.7.9 Certainty Functions.

    4.7.10 Preprocessing (Blurring the Certainty Functions) and Postprocessing.

    5 Lightspace and Antihomomorphic Vector Spaces.

    5.1 Lightspace.

    5.2 The Lightspace Analysis Function.

    5.2.1 The Spot-Flash-Spectrometer.

    5.3 The "Spotflash" Primitive.

    5.3.1 Building a Conceptual Lighting Toolbox: Using the Spotflash to Synthesize Other Light Sources.

    5.4 LAF×LSF Imaging ("Lightspace").

    5.4.1 Upper-Triangular Nature of Lightspace along Two Dimensions: Fluorescent and Phosphorescent Objects.

    5.5 Lightspace Subspaces.

    5.6 "Lightvector" Subspace.

    5.6.1 One-Dimensional Lightvector Subspace.

    5.6.2 Lightvector Interpolation and Extrapolation.

    5.6.3 Processing Differently Illuminated Wyckoff Sets of the Same Subject Matter.

    5.6.4 "Practical" Example: 2-D Lightvector Subspace.

    5.7 Painting with Lightvectors: Photographic/Videographic Origins and Applications of WearComp-Based Mediated Reality.

    5.7.1 Photographic Origins of Wearable Computing and Augmented/Mediated Reality in the 1970s and 1980s.

    5.7.2 Lightvector Amplification.

    5.7.3 Lightstrokes and Lightvectors.

    5.7.4 Other Practical Issues of Painting with Lightvectors.

    5.7.5 Computer-Supported Collaborative Art (CSCA).

    5.8 Collaborative Mediated Reality Field Trials.

    5.8.1 Lightpaintball.

    5.8.2 Reality-Based EyeTap Video Games.

    5.9 Conclusions.

    5.10 Exercises, Problem Sets, and Homework.

    5.10.1 Photoquantigraphic Image Processing.

    5.10.2 Lightspace Processing.

    5.10.3 Varying the Weights.

    5.10.4 Linearly Adding Lightvectors is the Wrong Thing to Do.

    5.10.5 Photoquantigraphically Adding Lightvectors.

    5.10.6 CEMENT.

    6 VideoOrbits: The Projective Geometry Renaissance.

    6.1 VideoOrbits.

    6.2 Background.

    6.2.1 Coordinate Transformations.

    6.2.2 Camera Motion: Common Assumptions and Terminology.

    6.2.3 Orbits.

    6.2.4 VideoOrbits.

    6.3 Framework: Motion Parameter Estimation and Optical Flow.

    6.3.1 Feature-Based Methods.

    6.3.2 Featureless Methods Based on Generalized Cross-correlation.

    6.3.3 Featureless Methods Based on Spatiotemporal Derivatives.

    6.4 Multiscale Implementations in 2-D.

    6.4.1 Unweighted Projective Flow.

    6.4.2 Multiscale Repetitive Implementation.

    6.4.3 VideoOrbits Head-Tracker.

    6.4.4 Exploiting Commutativity for Parameter Estimation.

    6.5 Performance and Applications.

    6.5.1 Subcomposites and the Support Matrix.

    6.5.2 Flat Subject Matter and Alternate Coordinates.

    6.6 AGC and the Range of Light.

    6.6.1 Overview.

    6.6.2 Turning AGC from a Bug into a Feature.

    6.6.3 AGC as Generator of Wyckoff Set.

    6.6.4 Ideal Spotmeter.

    6.6.5 AGC.

    6.7 Joint Estimation of Both Domain and Range Coordinate Transformations.

    6.8 The Big Picture.

    6.8.1 Paper and the Range of Light.

    6.8.2 An Extreme Example with Spatiotonal Processing of Photoquantities.

    6.9 Reality Window Manager.

    6.9.1 VideoOrbits Head-Tracker.

    6.9.2 A Simple Example of RWM.

    6.9.3 The Wearable Face Recognizer as an Example of a Reality User Interface.

    6.10 Application of Orbits: The Photonic Firewall.

    6.11 All the World's a Skinner Box.

    6.12 Blocking Spam with a Photonic Filter.

    6.12.1 Preventing Theft of Personal Solitude by Putting Shades on the Window to the Soul.

    6.13 Exercises, Problem Sets, and Homework.

    6.13.1 The VideoOrbits Head-Tracker.

    6.13.2 Geometric Interpretation of the Three-Parameter Model.

    6.13.3 Shooting Orbits.

    6.13.4 Photoquantigraphic Image Composite (PIC).

    6.13.5 Bonus Question.

    Appendix A: Safety First!

    Appendix B: Multiambic Keyer for Use While Engaged in Other Activities.

    B.1 Introduction.

    B.2 Background and Terminology on Keyers.

    B.3 Optimal Keyer Design: The Conformal Keyer.

    B.4 The Seven Stages of a Keypress.

    B.5 The Pentakeyer.

    B.6 Redundancy.

    B.7 Ordinally Conditional Modifiers.

    B.8 Rollover.

    B.8.1 Example of Rollover on a Cybernetic Keyer.

    B.9 Further Increasing the Chordic Redundancy Factor: A More Expressive Keyer.

    B.10 Including One Time Constant.

    B.11 Making a Conformal Multiambic Keyer.

    B.12 Comparison to Related Work.

    B.13 Conclusion.

    B.14 Acknowledgments.

    Appendix C: WearCam GNUX Howto.

    C.1 Installing GNUX on WearComps.

    C.1.1 GNUX on WearCam.

    C.2 Getting Started.

    C.3 Stop the Virus from Running.

    C.4 Making Room for an Operating System.

    C.5 Other Needed Files.

    C.6 Defrag.

    C.7 Fips.

    C.8 Starting Up in GNUX with Ramdisk.

    C.8.1 When You Run install.bat.

    C.8.2 Assignment Question.

    Appendix D: How to Build a Covert Computer Imaging System into Ordinary Looking Sunglasses.

    D.1 The Move from Sixth-Generation WearComp to Seventh-Generation.

    D.2 Label the Wires!

    D.3 Soldering Wires Directly to the Kopin CyberDisplay.

    D.4 Completing the Computershades.

    Bibliography.

    Index.