This is really cool and very clever. But I want to raise one thing.
> designed a special color reference chart that can be printed on a card
My rudimentary understanding of physics makes me suspect this sentence is a simplification.
A normal printer uses cyan, magenta, yellow, and black to print. A photo of such a print would already destroy a lot of spectral information, for the same reason the individual RGB sensors do.
So i suspect those colored dots are a very careful and deliberate concoction of very particular inks with very specific spectral color bands.
I suspect a lot of effort went into finding, mixing, and algorithmically combining the right inks.
I'm guessing it works similarly to how a narrow-band fluorescent lamp makes only materials that reflect a very specific frequency visible, which makes a lot of prints and pigments look weird. (If you do the opposite and use inks with very specific spectral bands, you can instead measure the lamp.)
Insanely clever. (Whatever they did)
Wouldn't it be nice if they just told us so we didn't have to speculate? This is cool stuff and I'm glad I know about it but, as someone interested in this field of study, I'd love to try this out. But I guess I should stop being surprised when even a company like IEEE can't be bothered to write an article with any actual information. Just a bunch of simplified summarized crap.
At the bottom of the article is a link to the paper, which is open access.
Not just that, but it would presumably be sensitive to the light source's emission spectrum too. As inks can only reflect wavelengths of light that hit them, if the emission spectrum has spikes or gaps - think LED or fluorescent - the reflected spectrum will be a function of the light source[1] (see the toy sketch after the footnotes).
Perhaps there's some accounting for this, and I'm curious to learn what it is, because it's a phenomenally complex problem.
1. You might think the sun is a standard source, but it's usually modulated by the atmosphere[2].
2. Unless you are in space.
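To make that concrete, here's a toy sketch (every spectrum below is made up; it just shows that the camera only ever sees the product of the source's spectrum and the ink's reflectance):

    import numpy as np

    # Toy illustration, all spectra invented: the measured light is
    # illuminant(wavelength) * reflectance(wavelength), so gaps in the
    # source become gaps in the measurement.
    wavelengths = np.linspace(400, 700, 301)                 # nm
    reflectance = np.exp(-((wavelengths - 550) / 60) ** 2)   # a hypothetical ink

    sunlight = np.ones_like(wavelengths)                      # idealized broadband source
    spiky_led = (np.exp(-((wavelengths - 450) / 10) ** 2)     # narrow blue peak...
                 + 0.6 * np.exp(-((wavelengths - 600) / 40) ** 2))  # ...plus a phosphor bump

    reflected_sun = sunlight * reflectance
    reflected_led = spiky_led * reflectance   # little signal wherever the LED emits little
    print(reflected_sun.sum(), reflected_led.sum())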
> Perhaps there's some accounting for this, and I'm curious to learn what it is
The slip itself is a calibration reference, so a clean photo of it could serve to compensate for the lamp and camera and calculate how accurate the readings are for different parts of the spectrum (rough sketch below). (But a good wide-spectrum light would be ideal for a high-precision readout.)
You're also still limited to visible light because of the camera's UV and IR filters, for which the sun is a decent reference.
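A rough sketch of that compensation idea, with invented numbers (this is my guess at the principle, not the authors' actual pipeline):

    import numpy as np

    # expected_rgb: each patch's RGB under a reference illuminant, taken from the
    # chart's calibration data (hypothetical values here).
    expected_rgb = np.array([[200.0,  60.0,  40.0],
                             [ 50.0, 180.0,  70.0],
                             [ 40.0,  60.0, 210.0]])

    clean_shot = np.array([[210.0,  55.0,  35.0],   # chart photographed under the actual lamp
                           [ 56.0, 170.0,  60.0],
                           [ 45.0,  55.0, 190.0]])

    gains = clean_shot / expected_rgb               # how lamp + camera skew each patch/channel

    sample_shot = np.array([[105.0,  50.0,  30.0],  # chart photographed through the sample
                            [ 28.0, 150.0,  50.0],
                            [ 22.0,  48.0, 160.0]])

    corrected = sample_shot / gains                 # readings as if taken under the reference light
    print(corrected)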
oh, yes of course! Thank you :)
Ink is perfectly capable of being a phosphor, in which case it'll up or down convert wavelength X to wavelength Y.
My gut feeling is that finding enough very specific wavelength-shifting inks would be harder. Perhaps it's a mix, though, to get good readings at the far edges between the RGB wavelengths.
I hope there is a research paper on this I can read.
Printing can use so-called spot colors.
I similarly thought that just because they said "print" does not mean it was printed on someone's inkjet. I'd hate to see how many different Pantone colors might be necessary.
If you only need one card per 10,000 photos, then the cost of the card starts to look cheap compared to a spectrometer and its bulk.
> The new patent-pending technique
> “Every photo carries hidden spectral information waiting to be uncovered. By extracting it, we can turn everyday photography into science.”
And with our patent, extract rent from anyone who wants to do it!
That's a bit of a bad faith take. You were welcome to go spend the years(?) this chap dedicated to putting together the research required to build this. If it works, let him enjoy the fruits of his labour.
Sure, if he'd come up with this primarily using his own resources and time, but he discovered this whilst being paid to conduct research at a public university, a form of institution which is explicitly intended to disseminate knowledge. Society should enjoy the fruits of its investment.
Patents do more than let you enjoy the fruits of your labor - the market already allows for that. Patents use the force of law to bar anyone else who might have discovered the same thing from building upon it.
Imagine you are just a dude, you did all this work, and go to "market".
You are just a dude, therefore business grows slowly.
You gather enough attention that some corporation with a lot of bling just goes and copies your thing.
Your business fails.
One can just as trivially construct an argument demonstrating the issues with patents, but the problem with this style of argument is that patents are not a simple thing. They have global, far-reaching effects. The government distributing a monopoly on information is a serious interference with the market, and due to patent harmonization efforts across the world, one person filing a patent in New Jersey affects even people in Kenya and Turkey and Thailand. The arguments for patents are often, as I see it, based on a deeply flawed understanding of the motivations of innovators and the effects of open information on innovation. For example, most arguments in favor of patents cannot explain how open source works, and so are clearly incomplete or outright wrong.
You have it entirely backwards; patents don't protect just-a-dude, they protect the corporation.
how?
just-a-dude doesn't have a team of patent attorneys sitting in his back office waiting for work.
On the other hand those same corporations can generate, file, and litigate more patents than just a dude could ever hope to.
It's 2007. Just-a-dude has a great idea, he notices customers to his website often buy just one item, so he'll let them do that with one simple click. What's this, he's just received a cease and desist? Sorry bro, Amazon patented that 10 years ago.
I mean we’re basically getting the same result. Tons of businesses, not to mention patent trolls, constantly harass individuals and small businesses trying to get their foot in the door or just run a small, sustainable business. Hell forget my business failing, it’s possible I’ll never even get to try my idea out!
I notice the article doesn't say anything about accuracy. This is not my area, but I think the _other_ hacky way to try to do spectroscopy with a phone is with a diffraction grating (and maybe a box with a slit in it). Diffraction gratings are cheap, probably not so different from a specially-printed reference card. If you have a choice, which is better?
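For the grating route, the usual DIY recipe (not from the article) is to calibrate pixel position against a couple of known emission lines, e.g. a fluorescent lamp's mercury lines. Roughly:

    # Map pixel column to wavelength using two known emission lines.
    # The pixel columns below are hypothetical; the mercury lines are real
    # (about 435.8 nm and 546.1 nm).
    col_blue, col_green = 312, 721
    lam_blue, lam_green = 435.8, 546.1            # nm

    slope = (lam_green - lam_blue) / (col_green - col_blue)

    def column_to_wavelength(col):
        # Linear pixel-to-wavelength calibration through the two known lines
        return lam_blue + slope * (col - col_blue)

    print(column_to_wavelength(0), column_to_wavelength(999))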
A diffraction grating wouldn't give you a controlled lighting environment (illuminant). They seem to handle that issue here by using a known spectral reference chart, which might let them handle any normal lighting environment.
I would think in the same environment you would take images immediately before and after adding the sample.
This doesn't really seem like "hyperspectral imaging". I think the idea is that having a reference colour chart of known reflectance characteristics and photographing it through a transparent substance gives you an idea of how much that substance attenuates each wavelength (a toy version of that calculation is sketched below).
It's a cool trick if it works, but it seems very finicky and I guess would be limited to transparent/homogeneous liquids?
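A toy version of that attenuation calculation, with made-up numbers (in practice both spectra would come out of whatever reconstruction the paper uses):

    import numpy as np

    reference = np.array([1.00, 0.95, 0.90, 0.88, 0.85])       # clean-chart spectrum, per band
    through_sample = np.array([0.52, 0.60, 0.81, 0.84, 0.80])  # same bands, sample in the path

    transmission = through_sample / reference
    absorbance = -np.log10(transmission)   # Beer-Lambert absorbance per band
    print(absorbance)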
In theory maybe you could build a version made of inks printed on a reflective mirror? And then you would hold the mirror so it reflected the object into the camera?
But that seems far more difficult. Precisely combining and applying combinations of inks to a mirrored surface sounds like a helluva manufacturing challenge.
Someone needs to build a phone that is leaning towards a tricorder; I'd buy that for myself and my kids. My Pixel 10 has a temp sensor on it, which is cool, but I've had minimal use for it so far.
I've always wanted to build a tricorder with my son, was just thinking about it last week when he was putting together a digital compass (with RasPi Nano, magnetic sensor, GPS, and LED light ring + OLED).
Take a look at the phyphox app - https://phyphox.org/
Wow! That app is amazing thanks for sharing
This is brilliant.
Caterpillar has a smartphone with a thermal camera. The price isn't far off the price of the most expensive smartphones.
https://cat.smartwalkie.com/store/products/cats62pro
There are also cheap ($200-$400 range) USB-C thermal cameras specifically for phone use (they're cheap because they're just a sensor; the app on the phone is the "screen" and controls). Great for narrowing down overheating hardware, and you can keep one in a pocket.
how does Cat's self repair policy compare to John Deere's? Then again, it's not far off from Apple's
I don't understand how they claim to derive 200+ independent values per pixel from the 3 independent values per pixel (RGB). Unless they are assuming a smooth "image" (all pixels the same RGB), perturbed only by the color card? Not exactly a camera, then.
They're not claiming to get that many values per pixel; they're getting that many values overall for the medium through which light passes between the card and the phone. The idea is that light comes from a source (e.g. the sun), bounces off the various colors of the card and thus produces hundreds of different spectra, those all pass through a medium, and land on the phone camera. So you're getting one measurement consisting of hundreds of RGB values that each represent the intensity of a different spectrum, and you combine it all together to get a single spectrum.
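As a rough sketch of what that inversion could look like: a generic regularized least-squares fit over stand-in chart reflectances, illuminant, and camera sensitivities (none of these values are from the paper, and the paper may well do something cleverer):

    import numpy as np

    # Assumed inputs (all hypothetical):
    #   R[i, k]: reflectance of chart patch i at wavelength k (characterized offline)
    #   E[k]:    illuminant spectrum (estimated from a clean photo of the chart)
    #   S[c, k]: camera sensitivity of channel c (R, G, B) at wavelength k
    # Unknown: t[k], the transmission of the medium at each wavelength.
    # Each patch/channel pair gives one linear equation:
    #   m[i, c] = sum_k R[i, k] * E[k] * t[k] * S[c, k]
    n_patches, n_channels, n_wavelengths = 300, 3, 200
    rng = np.random.default_rng(0)
    R = rng.uniform(0.0, 1.0, (n_patches, n_wavelengths))   # stand-in chart reflectances
    E = np.ones(n_wavelengths)                               # stand-in flat illuminant
    S = rng.uniform(0.0, 1.0, (n_channels, n_wavelengths))   # stand-in camera sensitivities

    # Forward model: stack all patch/channel equations into A @ t = m
    A = (R[:, None, :] * E[None, None, :] * S[None, :, :]).reshape(-1, n_wavelengths)

    # Simulate a measurement for a smooth "true" transmission spectrum
    t_true = 0.5 + 0.4 * np.sin(np.linspace(0, 3, n_wavelengths))
    m = A @ t_true + rng.normal(0, 1e-3, A.shape[0])

    # Ridge-regularized least squares keeps the recovered spectrum stable
    lam = 1e-2
    t_hat = np.linalg.solve(A.T @ A + lam * np.eye(n_wavelengths), A.T @ m)
    print(abs(t_hat - t_true).max())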
This could improve chromatic adaptation of captured images. In other words, better results when changing the white point.
Funny enough, that’s what photographers are doing when they shoot a color checker chart (e.g., Munsell, Macbeth, X-Rite).
White balance is hard, in part, because the sensitivity bands of our vision and the camera sensors do not align. Take a look at fluorescent (or, better still, sodium vapor) light spectra for clarity on why this is a massive pain.
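To make the color checker point concrete, here's a toy von Kries-style correction from a single neutral patch, with hypothetical numbers:

    import numpy as np

    neutral_measured = np.array([180.0, 200.0, 150.0])   # grey patch as shot (warm-ish light)
    neutral_target = np.array([200.0, 200.0, 200.0])     # what that patch should read

    gains = neutral_target / neutral_measured             # per-channel correction

    pixel = np.array([120.0, 140.0, 90.0])                # some other pixel from the same shot
    balanced = np.clip(pixel * gains, 0, 255)
    print(balanced)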
I was hoping that someone would come out with a camera that had sensors not only for visible light, but also for infrared and UV. It's just another color to add to the sensors; I think we have enough megapixels, so going for other bands seems reasonable.
I have a OnePlus 8 Pro with an IR camera. It's pretty nifty - nature photography looks cool, seeing through stovetops is neat (and seeing when they heat up), and VR things are also often playing around with IR (plastic transparent to IR, IR LEDs, etc).
I ended up having to flash Lineage: there was some outrage that, in a highly limited set of circumstances, thin see-through T-shirts became slightly more see-through, so OnePlus disabled that camera in their later firmware updates.
You have to love when amazing innovations disappear just in case the lowest-quality rung of our society might misuse something... I'm pretty sick of being ruled based on the lowest common denominator.
Back when I was in oil and gas, we were thinking of using modified mirrorless cameras without an IR filter for vegetation density calculations. There were a few vendors that sold the UAVs and modified cameras.
Nowadays, there is a more mature ecosystem, with specialized drone mapping cameras tailored for the purpose.
For our use case, the MicaSense RedEdge would have been perfect.
Same! I want to be able to capture more of the spectrum already!
I know many full size cameras have filters to specifically remove IR and UV from the images. Is this true for smartphones as well?
Yes.
CMOS image sensors are naturally sensitive to near IR. Early feature phones had no IR filters on their cameras - you could see an IR remote light up through them. But as people became more and more obsessed with smartphone camera quality, smartphones started to ship with those filters too. You get more "lifelike" colors that way.
Although in some multi-camera smartphones, one of the secondary cameras may lack an IR filter.
One of mine definitely lacks such a filter because I was able to catch not only the remote, but also an electric stovetop while it was still heating up and its glow was barely visible with the naked eye.
Some phones had a near-IR camera (Pixel 4, Samsung S10) accessible via API. No "killer app" has been found in the 5+ years since.
iPhones also have a near-IR front camera, but that one is fully slaved to the FaceID system. Don't think anything in userland can access raw data from it.
There are lots of 3D scanning apps using "Face ID", like Heges: https://hege.sh/
Those rely on the depth maps, which can be accessed from userspace. But the depth maps are derived from IR camera footage, which is not accessible.
Ironically, older iPhones have better depth resolving capability overall. Apple sacrificed depth sensing performance in favor of smaller unit size in the newer ones.
Patent-pending... again someone trying to rent-seek a high-school physics fair idea. Measuring light absorption with a camera is almost as old as the camera.
Using a known reflectance chart in-scene to recover spectral information is a standard calibration technique.
What "investment" is patent law protecting here?
What you're referring to is color calibration. This is spectroscopy. This is likely more of a chemistry paper than an engineering paper, because the ink in the reference chart is doing some heavy lifting.