A new multi-illuminant synthetic image test set called MIST is described and made publicly available. MIST is intended primarily for evaluating illumination-estimation and color constancy methods, but additional data are provided to make it useful for other computer vision applications as well. MIST addresses a problem found in most existing real-image datasets: the ground-truth illumination is measured at only a very limited number of locations, despite the fact that illumination varies significantly in almost all scenes. In contrast, MIST provides for each pixel: (a) the percent surface spectral reflectance, (b) the spectrum of the incident illumination, (c) the separate specular and diffuse components of the reflected light, and (d) the depth (i.e., camera-to-surface distance). The dataset contains 900 stereo pairs, each of the 1800 images being a 30-band multispectral image covering the visible spectrum from 400 to 695 nm at a 5 nm interval. Standard sRGB versions of the multispectral images are also provided. The images are synthesized by extending the Blender Cycles ray-tracing renderer. The rendering is done in a way that ensures the images are not only photorealistic but physically accurate as well.
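Working with a 30-band multispectral cube like MIST's usually starts by projecting each pixel's spectrum down to a tristimulus value. The sketch below is a minimal illustration of that projection, assuming a NumPy array of spectral radiance; the Gaussian curves are rough stand-ins for the real CIE 1931 colour-matching tables, and none of the names here come from the MIST distribution itself.

```python
import numpy as np

# Sketch: converting a 30-band spectral image (400-695 nm) to an
# approximate linear RGB rendering. The Gaussians below are crude
# stand-ins for the CIE 1931 x/y/z matching functions, not real tables.

def gaussian(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2)

def spectral_to_rgb(cube, wavelengths):
    """cube: (H, W, B) spectral radiance; wavelengths: (B,) in nm."""
    x_bar = gaussian(wavelengths, 600, 40)
    y_bar = gaussian(wavelengths, 550, 45)
    z_bar = gaussian(wavelengths, 450, 25)
    cmf = np.stack([x_bar, y_bar, z_bar], axis=-1)   # (B, 3)
    xyz = cube @ cmf                                  # (H, W, 3)
    # Standard XYZ -> linear sRGB matrix (D65 primaries).
    m = np.array([[ 3.2406, -1.5372, -0.4986],
                  [-0.9689,  1.8758,  0.0415],
                  [ 0.0557, -0.2040,  1.0570]])
    rgb = xyz @ m.T
    # Normalize to the brightest component and clip negatives.
    return np.clip(rgb / rgb.max(), 0.0, 1.0)

# Tiny synthetic example: a flat (spectrally uniform) scene.
wl = np.linspace(400, 695, 30)
cube = np.ones((4, 4, 30))
rgb = spectral_to_rgb(cube, wl)
print(rgb.shape)  # (4, 4, 3)
```

A real pipeline would read the per-pixel spectra from the dataset's OpenEXR channels and use the tabulated CIE matching functions instead of these approximations; the structure of the computation stays the same.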
Color Research & Application – Wiley
Published: Dec 1, 2020