If you like what you found here :

If you like this blog and found it useful, I would appreciate a small contribution in the form of a charity donation to one of these animal shelters. They're awesome people and really need the help. Thank you!

Monday, November 19, 2018

Apple Aperture is Not Dead (for now)

All Apple users probably know by now that Apple stopped maintaining its image editor and DAM software known as Aperture. Pure and simple, Apple decided that Aperture had reached the end of its life cycle.

There are many rumors behind this decision, from a supposed deal with Adobe to simpler things like cutting costs. Personally, I don't believe the latter, but who knows?

The fact is that Aperture still rocks. And rocks well.

So, what's so great about Aperture?

It's rock stable, period. Everything feels tested to exhaustion. I can't remember a single bug in it.

It's fast. It runs well on old computers and isn't a memory hog.

The Digital Asset Management (DAM) function is extremely good. It can handle large image collections and organize them in projects, catalogs, and albums. The rating system is near perfect, with color tags, star ratings, keywords and other flags.

The search tool is by far the most powerful one ever implemented in any DAM I know of. It's possible to search by all sorts of parameters, alone or combined.

I tested it with libraries of more than 80K images without any problem. It's possible to use more than one library and to move images between them.

It's possible to consolidate all images inside the library itself or to keep them outside, merely referenced by the internal database.

There is a very neat backup system called the "Aperture Vault". It's a library snapshot that can be stored anywhere you want: a local folder, an external hard disk, or even a networked drive.

The image editor is very nice if you don't need to work with layers or do cut-paste-make-boobs-larger kinds of things. It's a good tool for photo adjustments: colors, tones, retouching, levels, etc. The retouching and cropping tools are simply perfect.

The raw engine is part of the operating system and is updated according to Apple's policy for the OS version you have installed. It covers literally hundreds of camera models, and new ones arrive as OS updates.

If you need to, it's possible to use an external editor with Aperture. I use Affinity Photo for this.


There are many interesting plug-ins compatible with Aperture. My favorites are:


The Nik Collection:

Analog Efex Pro: Simulates film/vintage look
Color Efex Pro: Effects, retouching and correction tools
Silver Efex Pro: Simulates Black and White film
HDR Efex: Self-explanatory
DFine: Noise reduction tool
Viveza: Selective color and light adjustments
Sharpener Pro: A more complete sharpening tool 

Note: Until mid-2018 the Nik Collection was owned by Google and available as freeware, but DxO acquired it, and now it's paid (and expensive). I'm not sure if the new version is still compatible with Aperture. If you have the old free version, keep it safe!

There are other cool plugins that deserve a look:

DxO Film Pack: Professional film look simulation
DxO View Point: Professional geometrical corrections

Noiseware Professional had an Aperture plug-in in the past, but it was discontinued. It was a wonderful professional noise reduction tool. If you have it, grab it with both hands!

The later versions of the raw engine handle Fuji X-Trans raw files fairly well. Much better than Adobe Camera Raw.


But honestly, read the manual. There are so many features that are easy to overlook. I know reading manuals sucks, but at least try.


There are other quite neat features, like dozens of high-quality plug-ins and a fantastic degree of integration with Apple Automator and AppleScript.

The bad thing is that Apple removed it from the App Store, but if you purchased it before, you'll be able to download it again. My advice is to make a backup copy of the application itself and keep it in a safe place.

So, if you have a Mac, take a look at it while you still can.

It still works on macOS 10.14 Mojave, but I have a slight impression that things will change with the next macOS version... I also have the impression that I will NOT upgrade beyond Mojave.

Actually, with the advent of Affinity for Windows, all the important photography software being ported to Windows, and Apple once again making Mac users' lives miserable with soldered memory and SSDs (making Macs neither user-upgradable nor serviceable), I see no point in having a post-2018 Mac. I don't like Microsoft Windows, but I like Apple's new approach even less.

Macs are becoming just too expensive and a potential headache over time.

P.S.

Some of the cameras supported by the High Sierra raw engine aren't supported by Aperture. Surely Apple did this on purpose to force people to use Apple Photos.





Wednesday, October 3, 2018

Foveon page split in 3

Hi,

I just split the original Sigma Foveon page in three, divided into:

- Vintage Sigma Foveon (SD9/10/14/15 and DP1/DP2 non Merrill)

- Merrill Cameras (SD1M, DP2M, DP3M)

- Quattro Cameras (working on it)

The original page was growing too large.
Check the navigation bar at the top of the page.

Enjoy !

Saturday, September 29, 2018

Why still use a 4.7 MP Foveon camera in 2016 ?

Updated on Sept. 26th 2018

"Low Resolution" Foveon Cameras


From time to time someone asks me why I still use a 4.7 MP camera today. People usually believe that the more resolution, the better the images, but it's not that simple.

Depending on the use, you don't need a high-resolution image. What you need is the proper pixel density for a good viewing experience. What does that mean?

For example, if you like to make printed photos in, let's say, postcard size (6x4 inches or 10x15 cm), you don't need many megapixels to get a perfect-looking print.

This is because the average human eye is virtually unable to see any difference in a printed (or projected) image beyond 300 dpi at a distance of 20 cm. This means that if you print the same image at the same size at 300 and at 1000 dpi, you simply won't be able to notice any difference between them.

Let's do some simple calculations:

For a postcard-size print at 300 dpi we need (6 x 300) x (4 x 300) pixels = 1800 x 1200 pixels = 2,160,000 pixels = less than 2.2 megapixels.

For a larger A4 size (roughly 8.3 x 11.7 inches) we need about 8.7 megapixels (8 will do fine).

To display a photo fullscreen on a DCI 4K display (4096x2160 pixels) we need about 8.8 megapixels to fill every screen pixel, but depending on the viewing distance 4 or 5 megapixels will do fine.
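As a sanity check, the arithmetic above is easy to wrap in a tiny script (just a sketch of the calculation, not tied to any particular tool):

```python
def pixels_needed(width_in, height_in, dpi=300):
    """Pixel dimensions and megapixels needed for a print at a given dpi."""
    w = round(width_in * dpi)
    h = round(height_in * dpi)
    return w, h, w * h / 1e6

# Postcard, 6 x 4 inches at 300 dpi -> 1800 x 1200 = 2.16 MP
print(pixels_needed(6, 4))
# A4, roughly 11.7 x 8.3 inches at 300 dpi -> about 8.7 MP
print(pixels_needed(11.7, 8.3))
```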

Higher resolutions are good if you need to crop the image or if you're a pixel peeper and want to have some fun looking at it at 1:1 size. For larger prints at 300 dpi, obviously, you will need a higher resolution.

For poster-size prints, you usually look at them from a distance, and the print resolution can be lowered due to the nature of human vision. You can use an online calculator to check what print resolution you need, based on the print size and viewing distance.
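Those calculators are typically based on the roughly 1 arcminute angular resolution of the average eye. Assuming that figure (a common approximation, not an exact limit), the minimum dpi the eye can still resolve at a given viewing distance can be sketched like this:

```python
import math

def min_dpi(viewing_distance_cm, acuity_arcmin=1.0):
    """Print resolution beyond which an eye with the given angular
    acuity can no longer resolve individual dots at this distance."""
    distance_in = viewing_distance_cm / 2.54
    dot_size_in = distance_in * math.tan(math.radians(acuity_arcmin / 60.0))
    return 1.0 / dot_size_in

# The farther you stand, the fewer dpi the print needs
for d_cm in (50, 100, 200):
    print(f"{d_cm} cm: ~{min_dpi(d_cm):.0f} dpi")
```

At two meters, something around 40-45 dpi is already enough, which is why poster prints get away with modest resolutions.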

A very good one here
Another good tool is found here

In other words, you may not need the resolution you think.


Those old Sigma cameras from the pre-Merrill era, like the SD14/15 and DP1/2, are more than enough if you don't need really large prints. Images from them can easily be interpolated by a factor of 1.5 without any visible quality loss.

An interesting property of Foveon images is the absence of color aliasing. It's possible to upscale the 4.7 MP image to 150% (linear) without any perceivable quality loss, and with acceptable quality to 200%. The Sigma Photo Pro raw converter is very good for upsizing.


The output resolution of those 4.7 MP cameras (DP1/2 and SD14/15) is 2652 x 1768 pixels.

Upsizing it to 150% leads to 3978 x 2652 = 10.5 MP, more than enough for an A4 or Letter-size print.
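The same upscaling arithmetic in script form (just the numbers from above, nothing camera-specific):

```python
def upscale(width_px, height_px, factor):
    """New pixel dimensions and megapixels after linear upscaling."""
    w = round(width_px * factor)
    h = round(height_px * factor)
    return w, h, w * h / 1e6

# Native DP1/DP2 and SD14/SD15 output, upsized by 150%
print(upscale(2652, 1768, 1.5))  # (3978, 2652, ~10.5 MP)
```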


Just take a look at this image:



Sigma DP2 sample, from Sigma website
(C) Gris
and this other one:


Sigma DP2 sample, from Sigma website
(C) Gris

Go on... Use SPP to upsize them to 150%, print with good paper, ink and a good printer, and see for yourself.

Considering that any of those 4.7 MP (x3) Sigma cameras are very cheap now, it may be worth a try.

The last example was made with a Jurassic Sigma SD10, the second dSLR from Sigma, at just 3.3 MP.


Sigma SD10 image example
The image was first converted from X3F to TIFF using Sigma's own program SPP version 6 and then processed in Affinity Photo for some color correction and tone curve adjustment. Honestly, it's a very high-quality image, but small. Still impressive today, and easily upsizable to 4K for display on a UHD TV or monitor.





Friday, September 28, 2018

Kodak Signet 35

Kodak Signet 35
(I know, I'll make a decent picture ASAP)


This is a very nice small camera made by Kodak, USA, in the '50s. The lens is a super sharp 44mm f/3.5 Ektar, a Tessar-based design, with aperture settings from f/3.5 to f/22. To be honest, this lens's sharpness is a match for the Rollei 35's Tessar.

On the top plate it has the film advance and rewind knobs and the frame counter. The counter is, like on many Kodaks from this era, a regressive (count-down) counter. You need to set it manually according to the film's number of exposures. For example, for a 12-exposure film, you set the counter to 12, and it decreases by one for each frame shot. I really don't like regressive mechanical counters because it's very easy to forget to set them properly at the beginning.

The viewfinder is small and not very bright, and has no parallax compensation or framing lines, but at least it has a decent coincident-image rangefinder with good contrast. The rangefinder itself is precise, contrasty and very smooth. Even more impressive is the fact that it's still perfect after more than 60 years!

The shutter is very simple, with just the "high" speeds: 1/25s, 1/50s, 1/100s and 1/300s, plus B. It's reliable and has a very characteristic sound. This camera is really built like a tank. It's solid, very solid. From what I saw during the CLA, this simple shutter mechanism looks very reliable.

It's very easy to CLA. The complete shutter and optics assembly can be removed by unscrewing the module screw from inside the camera with the help of a spanner; it will come out in one piece.
To remove the front lens group, just grab it and turn it counterclockwise until it unscrews. It's then easy to remove the front plate and get access to the shutter components.

Kodak Signet lens: Ektar 44mm f3.5

You'll also note that this camera has no light meter, but there is an interesting slide rule for exposure calculation on the camera back. Too bad it's not marked with ISO/ASA values, but with old Kodak film types (Kodachrome, Plus-X, XX and Pan-X). I have no idea of the original specs of the mentioned Kodachrome. Plus-X was an excellent low-grain ISO 125 film, and XX was a "high speed" black and white film with a sensitivity of ISO 200. I'm not sure if this Pan-X meant the ultra-low-grain Panatomic-X (ISO 32). To be honest, thanks, Kodak, for NOT putting a meter on it.

Kodak Signet 35 exposure calculator 

The idea is to set the film type and the light conditions and get an exposure value. A bit clumsy, but better than nothing. I would carry a handheld light meter or even a cell phone app for this.

The only quirk is that the shutter needs to be manually cocked before each picture, but at least it has a double-exposure prevention lock that works very well. If you want to make a double exposure, there is a manual override: move the small lever at the bottom of the camera front. It's easy to forget to cock the shutter, but it's far more reliable than the dreaded Kodak Retina mechanisms.

Kodak Signet 35 back

There are two military versions, a black one (USAF) and a green one (US Army). Both are highly prized and can easily reach more than USD 500 in good condition. The standard model is easily found on eBay for just a few bucks and is worth every cent.

I got mine for $25 in so-so but working condition. I had to clean the objective, and I also did an ultrasonic cleaning of the shutter assembly and a basic CLA. The rangefinder assembly needed just some minor cleaning.

Final note:

The Retinas are some sort of cult camera: they have excellent optics, but they are very unreliable. They're hard to find in good working order and a bit overpriced.

This Signet is just the opposite: Simple, reliable, basic and cheap, but with a superb lens.



Tuesday, July 3, 2018

The Sky Color and Polarization

A Very Basic Introduction to Light Scattering

Scattering is a complex physical process in which electromagnetic waves, like visible light, microwaves and X-rays, and some mechanical waves, like sound, suffer some degree of deviation from a straight line due to particles in the path the wave would otherwise follow.

This process may arise from collisions with a myriad of things, from subatomic particles other than photons, like electrons and neutrons, through molecules, up to larger particles like dust in suspension and water vapor droplets, just to mention the most common ones.

Near-perfect reflections, where the incidence and emergence angles match, are due to the scattering process, but other types of reflection, like specular and diffuse, are also possible through it.

Let's concentrate on the very basic aspects of light scattering. Believe me, it's a very complicated process to analyze deeply. If you're interested, here is a good starting point. A deeper analysis is beyond the scope here.

One of the best-known visual effects of light scattering appears when someone sees a distant lamp on a foggy night. You will perceive that the light makes the fog visible (of course), but some other interesting effects are also visible, like halos (which are a combination of scattering, refraction and some other phenomena).

If you put some drops of milk in a glass of water and point a strong collimated or focused light source through the glass, you will "see the light beam". The same thing happens if you use a laser. The beam can be seen because of the Tyndall effect (Willis-Tyndall scattering), which occurs when the particle size in a suspension or emulsion is near the wavelength of the light.

Milk is a colloid made of water, fat, cells and other substances, but milk is not exactly transparent; that's the reason to dilute just a few drops in the water to make a usable colloid.

When I studied chemistry at the university, we used to demonstrate the Tyndall effect using a beaker with about 1 cm of metallic mercury at the bottom and the rest filled with distilled water. We then took two insulated wires with just the tips exposed: one went directly to the bottom of the beaker, connecting the metallic mercury to one pole of a 12V battery, and the other was connected to the other pole. We slowly lowered the second wire into the water until it touched the metallic mercury. At the moment of contact a spark formed, and a small quantity of mercury was vaporized and immediately condensed into extremely small droplets (from nanometer to micrometer size). The result was a water-mercury colloid, and it looked very clear, but when we pointed a laser beam at it, the beam became perfectly visible.

Things become even more interesting when you have a powerful laser beam perfectly visible in the atmosphere.

Laser used as part of the Adaptive Optics System of the ESO telescopes
(C) ESO - Babak Tafreshi - All rights reserved

But the laser beam itself shouldn't be visible (it isn't in a vacuum). It becomes visible when the light is scattered by particles suspended in the atmosphere.

The same scientist, John Tyndall, made some very interesting experiments in the late 19th century with polarized light and finally demonstrated that this polarization was mainly (though not only) due to sunlight being scattered by particles in atmospheric suspension.

If you like reading about classic experiments, I recommend digging into Tyndall's publications; they are an awesome read. Here is his famous article "On the Blue Colour of the Sky, the Polarization of Skylight, and on the Polarization of Light by Cloud Matter Generally", reproduced from the Proceedings of the Royal Society of London, Volume 17, 1868-1869. You can freely download it from JSTOR.

He discusses and demonstrates why vapors made of tiny particles of solids or liquids look bluish (smoke, for example). During this experiment he measured the polarization (using a Nicol prism) of light passing through a vapor-filled glass tube illuminated by an external light source. He took the polarizing angle measurements from all directions around the tube.

He had a brilliant and extremely important insight:

When a plate of tourmaline was held between the eye and the bluish cloud, the quantity of light reaching the eye when the axis of the prism was perpendicular to the axis of the illuminated beam, was greater than when the axes of the crystal and of the beam were parallel to each other. 

This was the result all round the experimental tube. Causing the crystal of tourmaline to revolve round the tube, with its axis perpendicular to the illuminating beam, the quantity of light that reached the eye was in all its positions a maximum. When the crystallographic axis was parallel to the axis of the beam, the quantity of light transmitted by the crystal was a minimum. 

From the illuminated bluish cloud, therefore, polarized light was discharged, the direction of maximum polarization being at right angles to the illuminating beam; the plane of vibration of the polarized light, moreover, was that to which the beam was perpendicular.

John Tyndall

Keeping it simple, this means that the polarization effect is ALWAYS at its maximum at 90 degrees from the light source. It explains why, when we use a polarizer filter to photograph the sky, the darkening (polarizing) effect is at its maximum strength when the sun is located 90 degrees from where the polarizer is aimed. Remember that we are talking about angles on a sphere, not in a plane. This also means that when the light source is at 0 (or 180) degrees, the effect is at its minimum.

He continued this awesome experiment, trying to figure out the nature of the bluish color of the studied vapor clouds.

To make it short and less boring for photographers to read: he demonstrated that the sky is blue because light is scattered by very small particles, with sizes near the blue light wavelength, while the scattering caused by water droplets and ice crystals in clouds is white because their "component particles" are much larger than the wavelength, scattering all wavelengths (colors) equally.

A shorter and more "modern" explanation of the sky's blue color can be found in the physics notes of Boston University:

"The way light scatters off molecules in the atmosphere explains why the sky is blue and why the sun looks red at sunrise and sunset. In a nutshell, it's because the molecules scatter light at the blue end of the visible spectrum much more than light at the red end of the visible spectrum.

This is because the scattering of light (i.e., the probability that light will interact with molecules when it passes through the atmosphere) is inversely proportional to the wavelength to the fourth power. 

Violet light, with a wavelength of about 400 nm, is almost 10 times as likely to be scattered than red light, which has a wavelength of about 700 nm. At noon, when the Sun is high in the sky, light from the Sun passes through a relatively thin layer of atmosphere so only a small fraction of the light will be scattered. 

The Sun looks yellow-white because all the colors are represented almost equally. At sunrise or sunset, on the other hand, light from the Sun has to pass through much more atmosphere to reach our eyes. Along the way, most of the light towards the blue end of the spectrum is scattered in other directions, but much less of the light towards the red end of the spectrum is scattered, making the Sun appear to be orange or red.

So why is the sky blue? Again, let's look at it when the Sun is high in the sky. Some of the light from the Sun traveling towards other parts of the Earth is scattered towards us by the molecules in the atmosphere. Most of this scattered light is light from the blue end of the spectrum, so the sky appears blue. 
Why can't this same argument be applied to clouds? Why do they look white, and not blue? It's because of the size of the water droplets in clouds. The droplets are much larger than the molecules in the atmosphere, and they scatter light of all colors equally. This makes them look white."

The last paragraph is 100% in line with Tyndall's observations and conclusions.
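The inverse-fourth-power relation quoted above is easy to verify yourself; here is a one-liner comparing violet (~400 nm) and red (~700 nm) light:

```python
def scatter_ratio(short_nm, long_nm):
    """Relative Rayleigh scattering strength: intensity scales as 1/wavelength^4."""
    return (long_nm / short_nm) ** 4

# Violet vs. red: about 9.4, i.e. "almost 10 times as likely to be scattered"
print(scatter_ratio(400, 700))
```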

If you like Wikipedia, they also have an excellent explanation of the Rayleigh sky model, but it's a complex matter reduced to the bare bones, so if you're not familiar with physics and optics at university level, it may be difficult to digest.

From the Wikipedia article, I bring here an important diagram that agrees completely with Tyndall's observations.

Rayleigh sky geometry
source: Wikipedia

Explanation:

"The geometry for the sky polarization can be represented by a celestial triangle based on the sun, zenith, and observed pointing (or the point of scattering). In the model, γ is the angular distance between the observed pointing and the sun, Θs is the solar zenith distance (90° – solar altitude), Θ is the angular distance between the observed pointing and the zenith (90° – observed altitude), Φ is the angle between the zenith direction and the solar direction at the observed pointing, and ψ is the angle between the solar direction and the observed pointing at the zenith.

Thus, the spherical triangle is defined not only by the three points located at the sun, zenith, and observed point but by both the three interior angles as well as the three angular distances. In an altitude-azimuth grid the angular distance between the observed pointing and the sun and the angular distance between the observed pointing and the zenith change while the angular distance between the sun and the zenith remains constant at one point in time."

(From Wikipedia)
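If you want to play with this geometry numerically, the angular distance γ between the sun and the observed point follows from the spherical law of cosines applied to the triangle above. A small sketch; parameterizing by the azimuth difference between the two directions is my own simplification:

```python
import math

def scattering_angle(sun_zenith_deg, view_zenith_deg, azimuth_diff_deg):
    """Angular distance (degrees) between the sun and the observed point,
    given both zenith distances and the azimuth difference between the
    two directions, via the spherical law of cosines."""
    ts = math.radians(sun_zenith_deg)
    tv = math.radians(view_zenith_deg)
    da = math.radians(azimuth_diff_deg)
    cos_g = math.cos(ts) * math.cos(tv) + math.sin(ts) * math.sin(tv) * math.cos(da)
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_g))))

# Sun on the eastern horizon, looking straight up: gamma = 90 degrees,
# which is exactly where the polarization is strongest
print(scattering_angle(90, 0, 90))
```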

Going back to the maximum and minimum polarization, they are brilliantly explained by the Q and U Stokes parameters. Yes, that Stokes, Sir George Gabriel Stokes. This guy was a true mathematics and physics badass. If you have the guts, here are some sources for your fun:



But honestly, I would not advise you to dig too deep into this subject. It's extremely complex and needs a deep, solid physics and calculus base just to start understanding it.

OK, back again to the summarized Wikipedia article on the Rayleigh sky. Here are the most important notes, quoted from there (you can check them yourself later if you feel the need):

  1. When the sun is located at the zenith, the band of maximal polarization wraps around the horizon. Light from the sky is polarized horizontally along the horizon. (note: the horizon is at 90 degrees relative to the zenith)
  2. When the sun is near the horizon, the maximum polarization is again at 90 degrees from it, passing through the zenith.
  3. Note that because the polarization pattern is dependent on the sun, it changes not only throughout the day but throughout the year. (note: the apparent position of the sun changes during the day)
  4. Many animals use the polarization patterns of the sky at twilight and throughout the day as a navigation tool. Because it is determined purely by the position of the sun, it is easily used as a compass for animal orientation. By orienting themselves with respect to the polarization patterns, animals can locate the sun and thus determine the cardinal directions. (bees and birds would have a huge problem otherwise)
  5. As the sun sets due West, the maximum degree of polarization can be seen in the North-Zenith-South plane. Along the horizon, at an altitude of 0° it is highest in the North and South, and lowest in the East and West. Then as altitude increases approaching the zenith (or the plane of maximum polarization) the polarization remains high in the North and South and increases until it is again maximum at 90° in the East and West, where it is then at the zenith and within the plane of polarization.


Don't blame me for quoting Wikipedia. This explanation is PERFECT, and I dare you to prove the contrary if you disagree.

A very nice (and not so technical) explanation of the involved processes can be found on this web page. The explanation is so good that I decided to cut my own text to a minimum. Polarization.com also has some very interesting material for further reading.

Another interesting read is about Polarized Light Patterns in the Sky, specifically item number 2.

So, we have some interesting situations:


  1. Sun at the zenith: maximum polarization (MaxPol) in a 360-degree circle along the horizon.
  2. Sun at the horizon, let's say to the east: MaxPol is along a semicircle from north to south, passing through the zenith. The MinPol points are to the east and west, near the horizon.
  3. For other directions just follow the same rule, but don't forget that the angles and vectors are on a sphere.
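For single Rayleigh scattering, the degree of polarization as a function of the scattering angle γ (the angle between the sun and the observed point) follows the classic formula P = P_max · sin²γ / (1 + cos²γ). A quick sketch showing it reproduces the situations above, maximum at 90 degrees from the sun and zero looking toward or away from it:

```python
import math

def rayleigh_polarization(gamma_deg, p_max=1.0):
    """Degree of polarization for single Rayleigh scattering at angle gamma.
    Real skies never reach p_max = 1 due to multiple scattering, haze, etc."""
    g = math.radians(gamma_deg)
    return p_max * math.sin(g) ** 2 / (1.0 + math.cos(g) ** 2)

for gamma in (0, 45, 90, 135, 180):
    print(gamma, round(rayleigh_polarization(gamma), 3))
```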


Now I think you understand the why of those annoying dark bands when you photograph the sky at certain angles using a wide-angle lens and a polarizer.

Sorry, there is nothing you can do to eliminate them, except in post-processing or by using a lens with a narrower field of view.

One more thing:

There are two types of polarizer filters, linear and circular. Linear polarizers are far more efficient at polarizing the light from the sky than circular ones.

This can be a good or a bad thing, depending on whether you want to enhance or attenuate the polarization. Linear polarizers tend to make the effect stronger, so the sky banding will be more or less visible depending on the polarizer type used.

Circular polarizers are needed when your camera system has a mirror. This is because the mirror modifies the polarization, and some light metering and autofocus sensors are placed behind a semi-silvered mirror; the difference in polarization may affect their readings.

If you're interested in a more detailed explanation, please check this page on the Lindsey Optics website. Bob Atkins also has a very cool explanation on his website.


There are other things that can change the sky color, like pollutants and dust. It's a well-known phenomenon that the sun gets redder at sunrise and sunset. This is due to a combination of two processes: scattering and absorption.

Larger particles in suspension in front of the sun block the shorter wavelengths (the blue-violet part of the solar spectrum).

This is also why we perceive the sun as yellowish: the short wavelengths are scattered by the extremely small particles and absorbed by the larger ones. From space, the Sun looks white.

Scattering and absorption are also extremely common processes in deep space. Looking at nebulae, you will see both processes, plus a third one called emission.


Lagoon Nebula in Visible light (left) and infrared (right)
Hubble Space Telescope - (C) NASA 
This beautiful image has very interesting examples of what I described. There is a very powerful light source at its center, a star, emitting huge amounts of all sorts of electromagnetic waves, from infrared to X-rays. The bluish color is caused by starlight reflected off the dust behind the star; the dust particles are so small that the longer wavelengths cannot be reflected. The dark areas are caused by light absorption: the shorter wavelengths are absorbed by the dust between us and the light, which is one of the reasons the nebula looks reddish.

The image on the right was taken with an infrared sensor and looks completely different. Infrared wavelengths are much longer, and the very small dust particles are much less efficient at blocking them. This is why we can see through the dust cloud and see this myriad of stars.

Some of the red/pink light is caused by the ultraviolet light from the star ionizing the hydrogen present in the nebula.

And, to complete the frame, the light is also polarized!











  

Tuesday, May 22, 2018

Fujifilm X-TRANS RAW annoyances

This is not a review. Just some short comments about the most important programs that can handle X-Trans raw files.


Updated on October 18th, 2018

INTRODUCTION

X-Trans raw files are a pain to process. Really.

The complicated nature of its non-standard demosaic algorithm is a real nightmare for both users and developers. 

From the user side, there are just a few programs able to extract the full potential of this new class of color matrix, and all of them, apart from the awfully complicated RawTherapee and Fuji's OEM version of Silkypix, are paid and usually expensive.

Things aren't different from the developer's point of view. There is almost no technical information about the mathematical methods for the decoding process, and many of the amazing sharpening and noise reduction algorithms used by well-known programs just can't cope with anything other than the traditional Bayer pattern. That's why, for example, DxO simply gave up on non-Bayer sensors.

These are my own opinions, based on my experience and needs. Of course, opinions vary from person to person.



The most practical options on the market as of today are:


IRIDIENT DEVELOPER

It's very powerful software and probably the best in terms of extracting the highest detail from the raw files. It's really good at that, and also at noise reduction and film profiles. The price is fair.

The drawbacks: a horrible interface, and it's slow. By horrible and slow I mean really horrible and slow. Its features are sometimes hidden in non-obvious places, and I never managed to find out how to apply settings to a group of images in real time. I challenge you to try.

But it's the best program if you want to have extremely detailed images from the X-Trans sensors.

Don't forget to download the film simulation profiles.

Pros: Generally very good results if you don't need to push too much sharpness.


Cons: Annoying interface and SLOW. Sometimes the sharpening adds funny artifacts that look like noise.


FUJIFILM RAW FILE CONVERTER EX 2.0 (OEM SILKYPIX)

It's a stripped-down Silkypix 4. It's free, so don't complain. =)

The interface is way better than the previous program's but still has some serious problems with (maybe) the Japanese-to-English translation. Some terms are just too weird.

The overall results are acceptable, but it's slow as hell. It can be upgraded to the latest SP Pro version for $150. Not a cheap upgrade, but not awfully expensive either.

Not recommended. It's slow like continental drift.


SILKYPIX 5, 6, 7, 8

The 8 Pro is a solid program, but the full price, about $250, is in my opinion too high.

The good thing is that it has support for the camera's built-in film simulation profiles. The sharpening algorithms are very good and way better than the equivalents found in Photoshop, Lightroom and Capture One.

They offer upgrades from several OEM and previous versions at a reasonable price, from $250 down to $100 in some cases. Worth checking.

After using SP 8 Pro for more than a month, I'm very impressed with the final results. The sharpening and noise reduction are maybe the best for X-Trans, and the colors are far more accurate than any competitor's.

Pros: Generally good results IF you figure out how to master the sharpening and noise reduction process, because they work together. I usually choose the "Natural Sharp" option.

Cons: You really need to read the manual. There are lots of "hidden" important features, and it's not the fastest program.

Forget about any version lower than 7 Pro. 



RawTherapee

It's free, fast and well documented, BUT it's by far the most complicated image processing program I've ever used in my life.

The sharpening and noise reduction functions are overwhelming and extremely comprehensive. Curves, color and histogram operations unfold into so many variations that you may scream the first time you see them.

The results can be awesome if you don't go nuts using it.

Pros: Can deliver stunning results, and there are really good film simulation profiles for it.

Cons: Extremely complex. You need to be a cross between a Zen monk and an image scientist to master it. Not kidding.



CAPTURE ONE and LIGHTROOM

Well, they work, but honestly their raw engines are no match for any of the above-mentioned programs, and they are very expensive for what they are. You need to run a true Olympic marathon of adjustments to get "near" the output of the others, so I won't even comment on them. I just gave up.

Capture One is a solid option, but a bit pointless for an X-Trans camera. It handles X-Trans files reasonably well, but there are better options for less than the $300 they ask for it. The only reason I can imagine to buy it is IF you're a very heavy C1 Media Pro user AND also use a Canon or Nikon camera in tethering mode; otherwise it's a complete waste of money. I tried it for 60 days in early 2018 and was not convinced.

Lightroom is almost a religion, and forgive me, fanboys, but its X-Trans support still sucks even in mid-2018, and I would never, ever use subscription-based software. It will be a money sink in the long run; just do your calculations and check for yourself. It's evolving, but there are better options.
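As a back-of-the-envelope example of those calculations, a tiny Python sketch. The prices here are assumptions for illustration only; check the current ones yourself before deciding.

```python
# Back-of-the-envelope subscription vs one-time-license comparison.
# Both prices are hypothetical placeholders, not quoted prices.
subscription_per_month = 10.0   # assumed monthly subscription cost
one_time_license = 250.0        # assumed perpetual-license cost

for years in (1, 3, 5):
    sub_total = subscription_per_month * 12 * years
    print(f"{years} yr: subscription ${sub_total:.0f} vs one-time ${one_time_license:.0f}")
```

At these assumed prices, the subscription overtakes a one-time license somewhere in the third year, and the gap only grows from there.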

CAPTURE ONE FOR FUJI

In October 2018, Phase One released two cheaper versions of Capture One, specific to Fuji owners: a free "Express" and a paid "Pro". Both have the same raw engine, but the Pro version has more features, like layers, retouching tools and tethering. If you don't need these features, or already handle them in another program like Affinity Photo, do yourself a favor and stick with the Express version. My advice is to download it before they change their minds and discontinue it.


AFFINITY PHOTO

This was a surprise. Its raw engine is still under development, but I got some interesting results from it with X-Trans files. I have to do some more experiments before giving it an honest rating.

There are many third-party Photoshop plug-ins that will run on Affinity. It's also compatible with Nik's collection.

Pros: It's FAST and good. Compatible with Nik collection, DxO Film Pack and Noiseware Professional plugins from Photoshop.

Cons: No film profiles.


Apple Aperture

If you're an Apple user and still have Aperture installed and updated, it is, incredibly, capable of very good results, rivaling even Iridient. Images tend to have less noise. You need to add the RAW Fine Tuning adjustment.

The raw engine is OS-dependent, so it supports all cameras supported by Apple's raw engine. The updates come with the OS, not with Aperture.

For film simulation, I use the DxO FilmPack plugin for Aperture. The response is a bit different from Silkypix's, but I'm very pleased with it.



Pros: Very nice final look with good colors and well-controlled sharpness. Compatible with Nik collection, DxO Film Pack and Noiseware Professional plugins for Aperture.

Cons: Aperture has reached the end of its development life. It still works on macOS High Sierra, but who knows the future. No film profiles.

If you're a Mac user, my advice is to keep it while you can and avoid updating the OS every time Apple tells you to.

I have a spare bootable OSX High Sierra and Aperture on an external hard disk just in case.


RAW Power


If you have a Mac and like simple programs, RAW Power, from the former Aperture developer himself, does a very decent job. It's essentially a nice front end to Apple's internal operating-system raw engine.

If you're a Mac user and don't have Aperture, I would recommend RAW Power as an extension to Apple Photos or as a stand-alone program.

Needs to evolve but it's cheap and honest and can be used as a Photos extension.

Desperately needs a built-in file browser.

Luminar

Also a decent raw converter that can deal well with the tricky X-Trans files. I tested it in trial mode and was quite pleased. No file browser, a shame!

Personally, I think it needs to evolve, but surely worth a try. 

It's cheap, reasonably fast and runs on Mac and Windows.


Bottom Note:

In the end, after some years using the X Series, the X-Trans matrix doesn't appear to bring any huge improvement over a Bayer sensor (without the AA filter), besides maybe some higher detail in high-frequency zones. But Fuji's package is very powerful considering the price and final results. Still a solid option.

Fuji has always liked to try different approaches to sensor design, like the Super CCD and its numerous incarnations, and I respect this. At least, like Sigma with the Foveon, they try.


ON1

It's a good surprise. It works fairly well with X-Trans files, has a decent file browser and is friendly to sequential processing. The interface is quite good.

Saturday, May 12, 2018

KAPSA RED DOT

This is a fun and very cheap box camera made in Brazil, between the '50s and '60s.

Its bakelite body is well made and follows the old box formula from the early '30s without any surprises or significant technological evolution. It was made to be very cheap and easily usable.

Kapsa Red Dot

It has two waist level bright viewfinders, for landscape and portrait orientation, like many other similar ones. Those viewfinders are dreadful to use. 

The lens is a 110 mm two-element achromat with a three-position focus lever (1-2 m, 2-8 m and 8 m-infinity). It has a very primitive coating. When set at infinity the lens uses just the two main front elements, but when you set it for shorter ranges, a third element is placed behind the shutter to shift the focus point. Well, sort of...

The shutter is extremely crude: just two settings, T and 1/100 s, plus three aperture settings, f/8, f/11 and f/16. I would use it with ISO 100 color film, or ISO 400 black and white film if you plan to use filters like a yellow or green one. No idea of the filter size, but it's surely some sort of push-on type.
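With the shutter fixed at 1/100 s, the usable film speed comes down to simple exposure arithmetic. A quick sketch, using the standard exposure value formula (an approximation that ignores the real transmission of this humble lens):

```python
import math

# Exposure value at ISO 100: EV = log2(N^2 / t), where N is the
# f-number and t the shutter time in seconds. The Kapsa's only
# timed speed is 1/100 s.
def ev100(f_number, shutter=1 / 100):
    return math.log2(f_number ** 2 / shutter)

for f in (8, 11, 16):
    print(f"f/{f}: EV {ev100(f):.1f}")   # roughly EV 12.6, 13.6, 14.6

# f/16 at 1/100 s with ISO 100 film is essentially the Sunny 16 rule
# (shutter time ≈ 1/ISO at f/16), so ISO 100 suits bright daylight,
# while ISO 400 leaves about two stops of headroom for filters or clouds.
```

So the camera covers roughly EV 12.6 to 14.6 at ISO 100, which is exactly the bright-daylight range these box cameras were designed for.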

It takes 120 film and can be used in 6x9 or 6x4.5 format. You choose the format by flipping two metal masks.

Kapsa film chamber. Note the masks for 6x4.5 format

Well, don't expect a tack sharp image, of course!

Soft images, lots of chromatic aberration and rather low contrast, but fun to use. I bet it will give better images with black and white film.

Curious about the photos it can make? Take a look!


Botanical Garden - Rio de Janeiro - Brazil
Camera: Kapsa Red Dot  Film: Fuji Xtra 400
Cheers!