Topics

Concert on Beach: Results


Dan Margulis
 

I’ve posted the results of the Concert on the Beach exercise, the third in a series of 11 case studies.

Reviewing: This image was part of the MIT study. We are asked to assume that this is a promotional image for the Chamber of Commerce of this beach town, with a view to illustrating the type of cultural activities found here. It is understood that the weather is not cooperating for this type of advertising shot.

We have 38 entries. Most people also submitted a list of their steps, thanks very much. I haven’t read these, because I’d rather get a sense of who was successful and who wasn’t before investigating why.

The files don’t have people’s names on them, and were random-generator numbered from #301 to #338. As with past studies, we also have a “par” version, #339. To get it, I chose what I thought might be the five best entrants, and averaged them, each one weighted 20%. This often creates a version that is superior to most if not all of its parents.
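For what it's worth, the "par" averaging described above is just a per-pixel weighted mean of the entries. A minimal sketch with NumPy (the small random arrays stand in for real image files, which are not reproduced here):

```python
import numpy as np

def make_par(entries, weights=None):
    """Weighted per-pixel average of same-sized 8-bit images.
    With five entries and no explicit weights, each is weighted 20%."""
    stack = np.stack([e.astype(np.float64) for e in entries])
    if weights is None:
        weights = np.full(len(entries), 1.0 / len(entries))
    par = np.tensordot(np.asarray(weights), stack, axes=1)
    return np.clip(np.rint(par), 0, 255).astype(np.uint8)

# Five hypothetical 8-bit RGB entries, averaged at 20% each:
entries = [np.random.randint(0, 256, (4, 4, 3), dtype=np.uint8) for _ in range(5)]
par = make_par(entries)
```

The averaging tends to cancel each parent's individual excesses, which is one way to see why the result is often better than any single parent.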

Normally I don't comment on results for two days after they're posted, but I'll throw one observation out there now. These entries include one of mine from 2017 when I was comparing results with the MIT group. By rule, I couldn't make any selections. This time around, I had the liberty to make a major change in the color of the ocean, which I did, and wondered to myself whether anybody else would have the same idea.

I needn't have worried: more than two-thirds of us did so. Meanwhile, if you’d like to know how your own version stacked up, download the par version and compare the two directly. Do you think you got the same kind of quality? If not, I hope you’ll find further discussion useful.

The Folder is in the group's Photos section, Case Study 2021: Concert on the Beach,
https://groups.io/g/colortheory/album?id=260473

I also have zipped all 39 entries and uploaded a 68 MB file to our Files section,
https://groups.io/g/colortheory/files/
Search for 021521_Concert_entries.zip
If you are going to study these versions I strongly encourage you to download these files. Many of the entries vary only in minor ways, and it is hard to see the impact of a change without toggling back and forth between them.

I look forward to your comments.

Dan Margulis


Kenneth Harris
 

My par picks are 305, 306, 309, 314, and 321. Mine isn't among them. I find 306 and 309 appealing, while 305, 314, and 321 do a good job of keeping some sense of the actual light. It's clear that not everyone thinks this photo should be carrying out the same task: some are thinking to show an exciting concert at the beach, others a record of time and place.

Are there any deep learning people is this group?

Ken Harris


Lee Varis
 

I wish I had thought of the sunset idea!


Dan Margulis
 



On Feb 15, 2021, at 9:19 AM, Lee Varis <varis@...> wrote:

I wish I had thought of the sunset idea!

No, you don’t.

You live in New England now, not California.

Those New Hampshire Yankees would have a big laugh at any silly furriners and left coasters who put that pretty sunset in!

Dan


Dan Margulis
 



On Feb 15, 2021, at 9:18 AM, Kenneth Harris <reg@...> wrote:


Are there any deep learning people is this group?

That question doesn’t parse, so maybe you could rephrase it. Conceivably I have a follow-up on it.

Dan


Kenneth Harris
 

Wasn't the point of the MIT study to generate data to feed AI to make pleasing pictures? I first came across the study when they put out an interesting paper on a possible smartphone app that attempted to speed the processing of larger images by splitting the data into two frequency ranges, processing them separately, and recombining them. Thus these case studies would also feed into AI analysis, if I understand things right, especially GAN-type deep learning.
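To picture the frequency-splitting idea: decompose the image into a low-frequency base (a blur) and a high-frequency residual, process the bands separately, then add them back. This is only a minimal sketch of the general technique, not the paper's actual algorithm, using a cheap box blur:

```python
import numpy as np

def box_blur(img, radius=1):
    """Cheap low-pass filter: average over a (2r+1) x (2r+1) window."""
    padded = np.pad(img.astype(np.float64), radius, mode='edge')
    h, w = img.shape
    k = 2 * radius + 1
    out = np.zeros((h, w))
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + h, dx:dx + w]
    return out / (k * k)

def split_bands(img, radius=1):
    """Split into a low-frequency base and a high-frequency residual.
    Adding the two bands back together reconstructs the image exactly."""
    low = box_blur(img, radius)
    return low, img - low

# Hypothetical use: brighten the low band only, leaving detail untouched.
img = np.random.rand(8, 8) * 100
low, high = split_bands(img)
result = np.clip(1.1 * low + high, 0, 100)
```

The appeal for a phone app is that the expensive processing can run on a small, downsampled low band while the cheap residual preserves full-resolution detail.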

Ken Harris


Dan Margulis
 



On Feb 15, 2021, at 10:12 AM, Kenneth Harris <reg@...> wrote:

Wasn't the point of the MIT study to generate data to feed AI to make pleasing pictures?

If I understood their rationale correctly it was rather quixotic. They wanted to use AI to produce pictures that were tailored to be pleasing to the particular individual. This is why they had five retouchers work through the 5,000 images, rather than 5,000 retouchers each working five, which might get a consensus about pleasing pictures generally.

I suppose that in principle this could be done, but I would think it would require that the person preparing the files for analysis be fairly skilled, since otherwise the results would get skewed by a large number of stupid errors that experienced people don't make.

I first came across the study when they put out an interesting paper on a possible smartphone app that attempted to speed the processing of larger images by splitting the data into two frequency ranges, processing them separately, and recombining them. Thus these case studies would also feed into AI analysis, if I understand things right, especially GAN-type deep learning.

I’d suspect so, provided they were evaluated by enough people. That you or I think that a certain version is good doesn’t make it so. If it’s you *and* I it becomes much more likely, and if five other experts say the same thing, it’s probably true.

Dan


Kenneth Harris
 

"I’d suspect so, provided they were evaluated by enough people. That you or I think that a certain version is good doesn’t make it so. If it’s you *and* I it becomes much more likely, and if five other experts say the same thing, it’s probably true."

Enough people, or the right people? Truly a question for these times, although I'd contend it goes back to the Gracchi brothers. My argument, perhaps too subtle, is that once AI can generate popular pictures, it might be time to consider alternate strategies toward an optimal picture for the intended use. There are, of course, confounding variables to consider in training the AI; still, this all feels imminent.

Ken


Kent Sutorius
 

I think it's interesting how groundbreaking their 2011 dataset would end up being. Bychkovsky et al. ended up using Retoucher C as the model for their machine learning. Numerous later scientific papers on automatic photo and tonal adjustment that use machine learning often test their models against the MIT model and/or use the dataset of images. Bychkovsky et al. were correct in their conclusion when they said, "We have built a high-quality reference dataset for automatic photo adjustment, which addresses a major need and will enable new research on the learning of photographic adjustment."

Kent Sutorius

On 2/15/2021 11:16 AM, Kenneth Harris wrote:
"I’d suspect so, provided they were evaluated by enough people. That you or I think that a certain version is good doesn’t make it so. If it’s you *and* I it becomes much more likely, and if five other experts say the same thing, it’s probably true."

Enough people, or the right people? Truly a question for these times, although I'd contend it goes back to the Gracchi brothers. My argument, perhaps too subtle, is that once AI can generate popular pictures, it might be time to consider alternate strategies toward an optimal picture for the intended use. There are, of course, confounding variables to consider in training the AI; still, this all feels imminent.

Ken




Gerald Bakker
 

Looking through all entries, I find the quality high. Most have convincing color and good contrast, no "disasters" as Dan calls them.
I like the idea of coloring the ocean; the question is how far to go. The original has a strong blue cast, and correcting that makes the ocean almost yellow. I struggled with that, decided to add some blue, but didn't dare go as far as many did. To be honest, I find the blue of the ocean in the par too much of a good thing. Somehow it looks incorrect under a sky that I think must be gray.

On the other hand, given that the correction is targeted for promotional purposes, a blue ocean may be the better choice.

My favorites are 306, 308, 323, 324 and 338. I prefer a lighter version and warmer colors. My own is 327.
--
Gerald Bakker
https://geraldbakker.nl


Harvey Nagai
 

Going back to last summer's case studies, this is the first par I don't find particularly likeable.

The audience is... just there.

The band is just there.  The bandstand, the beach, the ocean, the sky, they all kind of play well
together but nothing commands my attention.  When I look at the center of the frame I don't feel
compelled to look up to the stage or down to the audience.  It's... dispassionate?

I also find it hard to reconcile whites with a tangible blue cast in a sea of yellowness.

One interesting entry is 322. I don't agree with the colors (kind of brown), but it is very easy
on the eyes. Replace its color with the par's and I like it more than the par; it has more
contrast in the stage area to differentiate it from the audience.

My entry is 314. Yesterday I wasn't sure it was saying what I wanted it to say; today I think
it's saying exactly what I intended. Whether it's a story that Hampton Beach wants to tell
or prospective visitors want to hear...


Bill Theis
 

Mine was 328.

Rather than be conventional, I decided to try various blends of the LAB A and B channels with the L, instead of blending in RGB. This was a mistake from the point of view of introducing noise, especially any blends of the B into the L. An experiment that failed, not to be repeated.

I did, however, submit a second version that differed only in that I used Dan's Skin Desaturation action, though in hindsight I probably should have masked it for the musicians on stage. In general, I believe the skin tones in this version were less swarthy than in my first effort without desaturation.

I also chose to go much darker in the audience, to present the ambience of being in a dark auditorium, although mine was not the darkest. I was quite concerned about the reds being way out of gamut, which is the primary difference between my color and the par. I didn't get the incandescent yellow of the stage lighting, which I like in the par. I selected the ocean and made its color cooler, but not as much as others did, thinking it was getting on to sunset (which, as Dan points out, is absent in New England). I didn't correct the obvious distortion, since blending with everyone else's images then becomes problematic. So artistic choices were made, and not all went well, but I have a much clearer instinct now for what to watch for.
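For anyone curious what blending one channel into another at some opacity amounts to numerically, a normal-mode channel blend is just a per-pixel linear mix. A minimal sketch, assuming the channels are already available as arrays and the B channel has been pre-scaled into the L range (the values and opacity here are arbitrary illustrations):

```python
import numpy as np

def blend_channel(base, blend, opacity):
    """Normal-mode blend of one channel into another at the given opacity,
    the per-pixel math behind a Photoshop Apply Image channel blend."""
    base = base.astype(np.float64)
    blend = blend.astype(np.float64)
    return (1.0 - opacity) * base + opacity * blend

# Blend a (pre-scaled) B channel into L at 30% opacity:
L = np.array([50.0, 60.0, 70.0])
B_scaled = np.array([40.0, 80.0, 60.0])
L_new = blend_channel(L, B_scaled, 0.30)  # -> [47., 66., 67.]
```

The noise problem described above follows from this math: whatever noise lives in the blended channel is mixed straight into the luminosity in proportion to the opacity.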

I am less colorful than the par version, which is much better than my effort. It is great to take advantage of knowing the "answer" and repeating the correction as a learning tool.


Concerning the wide variety of submissions, my favorites are 321, 323, 335, 336, and 337; they make a nice blend to compare to the par.


Robert Wheeler
 

In the late 1960s, I spent two summers working at a Southern California beach food stand. Weather reports were mostly the monotonous "early morning low clouds and fog," with overcast conditions extending far into the afternoon most days. The area looked a lot like the starting beach concert picture, with dull light and a dull ocean that had an indistinct separation from the dull sky, but without the blue color cast.

I used the ACR filter and curves to reduce the cool cast and brighten things up. Warm tones on the stage seemed consistent with the lighting in the band shell. The ocean and sky required a mask excluding those changes to let a bit of color persist; then I used additional masked layers to gently increase blue/decrease yellow in the water. I had to decrease cyan saturation in the ocean to make it look more natural. Versions with dramatic blue/cyan water would not match California beach scenes and seem unlikely for New Hampshire (maybe possible in Maui or the Mediterranean).

The position of the sun was discussed earlier. In Berkeley a few years later, I heard people chuckle about new students from the east coast who would party most of the night, then drive up to park in the hills and wait to see the sun rise behind the Golden Gate Bridge (unfortunately in the west; oops). My submission is #305.


Hector Davila
 

Mine is 336.

I didn't see any color problems,
just brightness problems.

So I did a Curves White point
on the kid staring at the camera,
on his t-shirt arm.

Moving down his shoulder arm
t-shirt until it looks like
a bright sunny day to go to
a beach concert.

But then I realized the top half
needed to be selected to match
the bottom half (which machine
learning doesn't seem to do).

Then I threw the whole PPW panel
kitchen sink at the picture.

Biggerhammer
H-K 50% Luminosity only
Color-boost 16%  /  minus red
Skin-Desaturation 19%
VelvetHammer 16%
Sharpen 2012  20%
DarkenSKY/ minus blue


Hector Davila


Frederick Yocum
 

I deliberately added some letters ‘finit' into my offering, 307, because the last time I entered I couldn’t tell which one was mine. I needn’t have bothered. I suspect the par version is a truer rendition of the light levels in the scene, but I had fun going over the top.

Dan mentioned something in the last exercise that piqued my interest: that flesh tones were better assessed using CMYK than LAB. Are there other colours that benefit from reviewing in a specific colour space?

I usually just stick with having the Info palette show me LAB and whatever colour space the image is in. I used to review colours in CMYK quite a lot when my workflow ended in CMYK, but that was usually to check the numbers I was going to end with.
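To make the CMYK check concrete: the usual rule of thumb for light skin is that cyan runs well below the magenta, yellow sits at or slightly above the magenta, and there is little or no black. Here is a rough sketch of that kind of sanity check; the exact thresholds are my own loose approximation, not Dan's published numbers:

```python
def plausible_light_skin(c, m, y, k):
    """Very rough plausibility check for a light skin tone in CMYK
    (ink percentages, 0-100). Thresholds are loose approximations of
    the rule of thumb, not authoritative values from Dan's books."""
    if m <= 0:
        return False                       # skin always carries real magenta
    return (0.1 * m <= c <= 0.4 * m        # cyan well below the magenta
            and m <= y <= 1.25 * m         # yellow at or a bit above magenta
            and k <= 5)                    # little or no black in light skin

print(plausible_light_skin(10, 45, 50, 0))   # plausible reading -> True
print(plausible_light_skin(30, 45, 40, 0))   # cyan high, yellow low -> False
```

The attraction of CMYK for this is exactly that the check is about relationships between the inks rather than absolute values, which is harder to eyeball in LAB's A and B numbers.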

regards,

Frederick Yocum


Doug Schafer
 

On Tue, Feb 16, 2021 at 03:48 AM, Frederick Yocum wrote:
."..flesh tones were better assessed using CMYK than LAB..."
I don't have access to Dan's books (they are at home and I'm on vacation), so now I'm curious. I have the LAB numbers memorized, and a handy cheat sheet if I forget, but I have no clue what the equivalent acceptable CMYK skin-tone numbers would be, or what errant CMYK numbers would cause one to think a skin tone was wrong.

Dan, can you elaborate on what we should look for or not find when checking skin tones in CMYK?

Doug Schafer


john c.
 

Mine was 334, decidedly colder than the par and most others. That's attributable to my slightly-under-5000K calibrated monitor and my aging eyes, but mostly to my thoughts about the color of daylight I preferred for the time of day I wanted this to be. I added a curves layer to warm it up (slight red boost and negative blue, with a slight luminosity darken) and I really like it, so much so that I blended it with the par version, which I believe helps it a great deal. IMHO, of course.


Hector Davila
 

Mine is 336.

My goal was to sell this concert
by giving it a bright sunny day.

(nobody will attend a concert if the photo looks grey with a chance of rain.)

So I did a Curves White point
on the kid staring at the camera,
on his t-shirt arm.

Which made it brighter and sunnier, and the colors stood out more.

(If the white shirt loses details, that isn't important.)

On a personal note: when I was a kid, I was trying to decide between buying a Canon camera and a Yashica camera.
I decided on the Yashica Electro 35 because the prints had warm colors.

Hector Davila

On 2/15/2021 3:07 PM, Hector Davila wrote:
Mine is 336.

I didn't see any color problems,
just brightness problems.

So I did a Curves White point
on the kid staring at the camera,
on his t-shirt arm.

Moving down his shoulder arm
t-shirt until it looks like
a bright sunny day to go to
a beach concert.




John Furnes
 

Hi,

 

Mine is #331.

After seeing all the entries and the par version, I realise that I have been too moody. The overall impression is grey and lacklustre. There is a violet cast over it, and all the warmth is gone, even though in NH in July the sun sets around the time this picture was taken, and there should be quite a bit of leftover sunshine.

I am not happy with it.

I also think that the par-version is a bit blue.

The only one I think is better than most is #327.

 

John Furnes


David Remington
 

My version is 321.

My goal was to make the scene inviting while remaining a believable photographic representation of the conditions. One that would fit with how you might see the scene in person or remember it later. To draw attention to the performance I kept the warm feel of the spotlights and added some additional contrast and saturation. I wanted that area to balance with the colorful spectators.

Looking at it with fresh eyes, and in context of the other submissions, I would go a little warmer (but not sunny day warm), add more contrast to the water/sky (but stay with the hazy look), and make the band area pop a bit more. I think that spot should draw the eye.

David