The CO2 Greenhouse Effect on Climate: Is the Sun Cooling?

Background

For the past two months or so, we’ve been working on modeling the Earth’s climate as accurately as possible. The latest version of our JavaScript calculator (v1.4119) is now posted online.

In the initial stage of this project, we used the following formula (from this simple model) for the clear-sky greenhouse effect:
ε_A = 1 − exp(−0.4634·exp(18.382 − 5294/T) − 3.52·C_CO2)

This is simple to understand, in principle – the emissivity of the atmosphere varies exponentially with water vapour concentration (H2O, which depends on the dew-point temperature) and with CO2 concentration (assumed roughly uniform globally). The remainder (clouds, dust, ozone, etc.) is treated as constant here, which I consider a flaw; we’ll see later what is actually constant and what isn’t (clouds are definitely not!).
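For reference, here is that formula transcribed directly into JavaScript (a minimal sketch; the function name clearSkyEmissivity is ours for illustration, and the units expected for cCO2 are whatever the original simple model uses, which is not spelled out above):

// Clear-sky atmospheric emissivity from the simple-model formula above.
// T is the temperature (K) driving the water-vapour term; cCO2 is the CO2
// term in whatever units the original model expects.
function clearSkyEmissivity(T, cCO2) {
    return 1 - Math.exp(-0.4634 * Math.exp(18.382 - 5294 / T) - 3.52 * cCO2);
}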

Problems and Solutions

This simple model overestimates the combined effect of CO2 and water vapour on the full greenhouse effect, due to the way the exponential functions are put together. Every version of our model that uses this formula, however, behaves inversely for reasons as yet unknown: CO2 causes cooling. Obviously, something is inverted. But with the sign flipped, the formula produces a runaway greenhouse effect that would make Venus look like an ice ball. We might be applying it wrong, but ultimately this formula seems to be flawed (or at least limited to a narrow range of temperatures).

Enter the new greenhouse effect formula, based partly on Schmidt et al.:
gamma = (1+co2)*(1+h2o)*(1+others)*(1+ozone) - 1

Here gamma is the greenhouse effect (downward flux forcing) as a fraction of solar flux at the surface (Is). Thus, to calculate the surface temperature, all we have to do is:
Ts = fourth-root ( Is * (gamma + 1) / ( eS * sigma ) )

Where Ts is the temperature at the surface, eS is the emissivity of the surface (usually 0.9 dry to 0.96 wet), and sigma is the Stefan-Boltzmann constant, 5.67e-8 W/(m^2·K^4).

The result should be close to 288K (15°C) at gamma = 0.654, Is = 237 W/m2 and eS = 0.96.
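As a quick sanity check, here is that relation in JavaScript (a minimal sketch; the function name is ours for illustration):

var sigma = 5.67e-8; // Stefan-Boltzmann constant, W/(m^2*K^4)

// Surface temperature from the gamma formulation above.
function surfaceTemperature(Is, gamma, eS) {
    return Math.pow(Is * (1 + gamma) / (eS * sigma), 0.25);
}

// With the figures quoted above:
console.log(surfaceTemperature(237, 0.654, 0.96)); // ~291 K, in the neighbourhood of the 288 K target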

But the question is, what is gamma, physically? And how do we derive it from basic concentrations of various gases, each gas having a different proportional forcing effect?

Well, we have to use the actual absorption bands (spectra) of each of the gases. It turns out that water absorbs up to 31% of Earth’s emitted radiation at 8000 ppm, and CO2 up to 14% at 8000 ppm. The concentration matters because it sets the optical depth: lower concentration means greater transparency. The ppm values are molar concentrations (i.e. mol of CO2 / mol of air = concentration of CO2 in air). 8000 ppm is a high enough concentration that we get nearly 100% of the effect we will ever get.

Therefore, we have this simple code:

// (Wrapped here as a standalone function of CO2 level in ppm and dew point in K.)
function greenhouseGamma(co2level, dewpoint) {
    var co2conc = co2level; // CO2 concentration, ppm
    var vapconc = Math.exp(18.38 - 5294 / Math.max(dewpoint, 205)) * 10000; // water vapour, ppm (from dew point in K)
    var lapseAvgCO2 = 0.875; // 87.5%, or up to 70 km in atmosphere - a VERY slow lapse rate! (is this correct??)
    var lapseAvgH2O = 0.07;  // 7%, or up to 5.6 km in atmosphere
    var co2 = 0.14 * (1 - Math.exp(-lapseAvgCO2 * 10000 * co2conc / 1e6)); // 0.7% at 15 ppm to 14% at 8000 ppm
    var vap = 0.31 * (1 - Math.exp(-lapseAvgH2O * 10000 * vapconc / 1e6)); // 1.5% at 210 K (15 ppm, 0.02 mb) to 31% at 285 K (8000 ppm, 13.5 mb)
    var others = 0.065; // aerosols, CH4 (clouds included in vap)
    var ozone = 0.036;
    return (1 + co2) * (1 + vap) * (1 + others) * (1 + ozone) - 1;
}

Note – the lapse values are used to correct the concentration for the gas’s vertical distribution in the atmosphere. That is, at height H1, what is the concentration with respect to height H0 (sea level)? It turns out that with CO2 we have roughly the same concentration up to 70 km, then a sharp drop-off, so we still have 380-400 ppm all the way up to the stratosphere. Using the lapse rate of the air, it is possible to calculate the equivalent concentration of air at altitude, arriving at a height of 54.7 km where the concentration of the air matches that of CO2. This is how we determine our “scale height” of the CO2 column. The same principle works with H2O, where the lapse rate is much more abrupt, leading to nearly zero concentration at 5.6 km (instead of 70 km for CO2). You’ll notice most places on Earth at 5.6 km altitude don’t have clouds or precipitation, so this approximation is quite accurate. Based on this, we can conclude that most of the H2O greenhouse effect occurs in the lower troposphere, so adjusting to the entire column height of air, we get a lapse value for concentration of 7%. This means 8000 ppm at sea level is in reality 560 ppm across the entire column due to this sharp lapse rate.
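As a minimal sketch of that column correction (the function name is ours for illustration; the lapse factors are the same ones used in the gamma code above):

// Effective column concentration: the surface concentration scaled by the
// lapse factor describing how much of the column the gas effectively occupies.
function columnConcentration(surfacePpm, lapseFactor) {
    return surfacePpm * lapseFactor;
}

console.log(columnConcentration(8000, 0.07));  // 560 ppm for H2O, as stated above
console.log(columnConcentration(400, 0.875));  // ~350 ppm for CO2, applying the same factor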

You’ll notice the numbers 56 and 57 a lot, especially if you keep in mind that the solar temperature is around 5600 Kelvin to 5700 Kelvin and that the solar radiation wavelength is around 560 nm to 570 nm. All of this points to a potential holographic fine balance (a sort of numerological conservation) of mass-energy-information within the Earth-Sun-Moon system. But alas, we digress…

Conclusions

The model in this post suggests the CO2 height in the atmosphere has dropped in the last century, from about 57 km down to 54.7 km, a distance of about 2.3 km. This is no small change, and it indicates one of two things: (1) greater net emissions of CO2 are causing the overall total mass of CO2 to increase, or (2) the total mass of CO2 has remained roughly the same, but the sun’s radiation at stratospheric heights has been cooling.

If the sun were cooling, things would be quite interesting. We would indeed observe a greater apparent sea-level concentration of CO2 simply because there is less energy available to heat and dissipate the CO2 molecules. This concentration would also increase due to reduced excitation at the edge where the CO2 layer meets space, especially since less UV radiation would be hitting the CO2 molecules directly. In other words, today’s perceived high CO2 concentration may be due to the sun emitting less shortwave radiation and more longwave radiation, equivalent to a Doppler red shift, or a cooling of the sun’s radiation.

57 km is well above the ozone layer, where all three types of UV radiation come in. CO2 has absorption bands in the ultraviolet range, which means it should theoretically heat up and, at rarefied concentrations, even leave Earth’s atmosphere and escape into space. It would not cool down and freeze, since its partial pressure is so low. So the increase in CO2 concentration is particularly troubling once you realize how it is connected to the energy level of the sun.

Long story short, if the sun were emitting longer wavelengths, at a lower temperature (say, 5500 K), we would notice that as an increase in apparent CO2 concentration and an even greater increase in the greenhouse effect. Even our regular daylight would change, with more reds and yellows, fewer blues and violets, and even less ultraviolet. With less ultraviolet available to break O2 into O3, we would also see a reduction in the total level of ozone directly related to the sun’s longer wavelengths. Thus the ozone hole, which has been observed since the late 1970s, would also be connected to the rise in CO2 and the cooling of the sun.

Eventually, the cooling of the sun should overpower the greenhouse effect, which is relatively weak on Earth, leading to a substantial cooling. The magnitude of that cooling remains to be explored in a future article.

Sun Calculator

Dan’s Sun Calculator is the latest addition to Dacris Software’s repertoire of curious and mildly entertaining software. This is very much a v0.1 alpha, as it is missing a number of features, listed below:

Notes on use (i.e. missing features):
- Temperature is based on dry air, with no variation for cloud cover, wind, or precipitation, and does not take into account the typical 42-day heating/cooling delay due to the air’s heat capacity
- As such, there are slightly greater extremes, though not as extreme as one might expect
- Oceans have an even greater heating/cooling delay, usually measured in multiple years; this is not taken into account
- The Arctic region has a significantly higher albedo, but this is not taken into account

The radiation figure is accurate, but Earth’s elliptical orbit causes about 7% total variation in incoming radiation over the year. This is not taken into account (yet).
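For what it’s worth, here is a rough sketch of what that orbital correction could look like when we add it; the eccentricity (about 0.0167) and perihelion date (around January 3) are standard textbook values, not something the calculator currently uses:

// Approximate Earth-Sun distance (AU) versus day of year,
// using a first-order eccentric-orbit correction.
function earthSunDistanceAU(dayOfYear) {
    var e = 0.0167; // orbital eccentricity
    var theta = 2 * Math.PI * (dayOfYear - 3) / 365.25; // angle from perihelion (~Jan 3)
    return 1 - e * Math.cos(theta);
}

// Scale a mean solar flux by the inverse square of the distance.
function solarFluxAtDay(dayOfYear, meanFlux) {
    var r = earthSunDistanceAU(dayOfYear);
    return meanFlux / (r * r);
}

// Peak-to-peak variation works out to about 7%, consistent with the figure above.
console.log(solarFluxAtDay(3, 1361) / solarFluxAtDay(185, 1361)); // ~1.07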

It would also be interesting to simulate the effects of global warming. Of course, that would require accurate simulation of wind and precipitation patterns, which would in turn require a topographically accurate map.

Windows 8 and Benchmarks

This just in: Windows 8 banned by HWBot (Extremetech)

A serious flaw has been discovered in Windows 8’s RTC (real-time clock) mechanism. If the base clock frequency is changed from its default (for example, by overclocking or underclocking), the system clock ends up running at a different speed. Most PC benchmarks expect the system clock to run at a constant, correct speed. Dacris Benchmarks 8.1 uses the RTC and assumes that the system keeps accurate time during the test.

The effect is demonstrated in the following video:

We therefore cannot guarantee accurate results on Windows 8, as we would on Windows 7 and earlier OSes.

If you are running Dacris Benchmarks on Windows 8, you should verify that your results are accurate by timing the test run with an independent timer alongside your system clock, making sure the two stay in sync, as in the above video.
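One way to do that check programmatically is to compare elapsed wall-clock time against a monotonic timer and see whether the two drift apart. The sketch below is purely illustrative (it is not part of Dacris Benchmarks) and uses Node.js’s perf_hooks for the monotonic timer:

// Compare elapsed wall-clock time (Date.now) with a monotonic timer
// (performance.now) over an interval; a large difference suggests the
// system clock is not keeping honest time while the benchmark runs.
var performance = require('perf_hooks').performance; // Node.js; built in on browsers

function checkClockDrift(intervalMs, callback) {
    var wallStart = Date.now();
    var monoStart = performance.now();
    setTimeout(function () {
        var driftMs = (Date.now() - wallStart) - (performance.now() - monoStart);
        callback(driftMs);
    }, intervalMs);
}

checkClockDrift(10000, function (driftMs) {
    console.log('Clock drift over 10 s: ' + driftMs.toFixed(1) + ' ms');
});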

Follow-up: RTC Issues Blown Out of Proportion

Dacris Benchmarks 8.1 Now Freeware

March 6, 2013 — FOR IMMEDIATE RELEASE

If you are still testing the performance of your computer in 2013, Dacris Benchmarks is now available for free, with a freeware EULA.

So if you are getting a PC in 2013, Dacris Benchmarks is your tool of choice for evaluating essential performance metrics.

Download, share, and enjoy!

RAIN 2102 – PDF Version Now Available

Alfred, a serious career programmer in Upper Internet, sees his life fall apart gradually in the year 2102. Will he end up a lowly Shaman Warrior? Will he reconcile with his wife, Marla?

How will he get his life back on track?

Or maybe you’re just interested in how technology changes society in the future.

Find out in the first release of the novel series “RAIN”.

Order today (PDF eBook)!

Email contact at dacris.com if you’d like to request a free copy.

Preview

It was raining again. Alfred walked out of the Eternal Equinox in disgust, little realizing that he had left his coat in his car. They were charging $150.00 for a beer, double what he had paid the week before. After six (or more) beers and no success, Alfred’s frustration finally spilled over in the form of complete outrage at something seemingly so trivial – the price of beer. He yelled: “Oh God! Why do you hate me so much?! WHY!? I can’t get lucky? Not even once? There’s NO WAY I’m paying $150.00 for a beer!” Alfred had quite a bit of money left, over $20 million in fact. He had saved it up by working long hours on clandestine projects for even more clandestine clients. His latest incarnation – the history simulator (or “Looking Glass” as he preferred to call it) was an ambitious attempt to re-create world history, in the most minute detail, in the form of a computer simulation that would display all of the variables in real time and allow the “visitor” (who would be able to enter this virtual world by using a consciousness-portal) to observe and alter history by “possessing” the mind of an unsuspecting subject. Now, this being virtual, there were no real ethical concerns like the grandfather paradox. (Or were there?) Of course, there was always the question of what would happen when Alfred managed to successfully reproduce history with 100% (perfect) accuracy. Alfred was not about to let his client get his hands on the full unadulterated version of his “Looking Glass” project. In a peculiar way, Alfred had a conscience (yes, morals) and suspected that the project could be used for unimaginable evil.

Windows 8, Windows RT, Microsoft Extinction Event

Microsoft began its Windows business in the 1980s by selling the software (the operating system) that IBM’s hardware badly needed. This was the kernel of business that would eventually become Microsoft. As the PC (IBM PC at first) began its proliferation, growing in popularity exponentially, Microsoft began its growth as a software company. Sure, there was the enterprise segment which Microsoft tried to win over multiple times throughout its lifetime, but it was almost always dominated by UNIX variants. Both users and developers tolerated Microsoft’s environment as time went on, with Windows XP being the most successful app environment in terms of longevity.

In 2001, Microsoft bet the company on .NET. Let me remind you that this is now 2012, 11 years later. Windows Vista was the first OS to integrate .NET. The migration path for developers was clear: embrace .NET. Windows 7 wonderfully integrated the new .NET Framework 3.5, and now .NET developers could finally expect to build real desktop applications on Windows. Of course, there was also the legacy C++ (Win32) environment that had to be supported, which 99% of Windows apps still rely on, but that was obviously not a problem for anyone at Microsoft. Clearly, .NET would be far more successful as an API than Win32 ever was.

The year was 2011, and ARM processors began to worry Microsoft. There was only one time in the past when other architectures (besides x86) posed any real threat to Microsoft, and that was in the early days of Windows NT, in the mid 1990s. But this time, the threat came from a plethora of non-PC devices called “smart phones” or “tablets” or “netbooks”. Not only were these devices extraordinarily cheap, cutting into Microsoft’s margin on OS sales, but they were also growing at a much faster rate than Microsoft’s flagship platform, the PC, on which its Windows empire was built. In 2012, you can get yourself a top-notch 2007 workstation in the form of a tiny ultra-thin laptop, and the price of the OS (Windows) represents at least half of the price of the device itself.

Things have changed, indeed, and Microsoft’s OS monopoly is now threatened by changing consumer expectations, and a changing hardware environment that no longer resembles the slow, homogeneous pace of change of the 1980 – 2005 era dominated by the desktop PC form factor.

It can be argued that Microsoft can expand into the enterprise sector to survive, but this sector requires a stable API and a secure OS. It requires the scalability and flexibility provided by open source software like Linux. It will be tough for Microsoft to penetrate very deep into enterprise, and this is exemplified by its inability to even gain a foothold in the cloud computing sector. Microsoft was never a successful hosting company, and I doubt they ever will be.

I urge the reader to take a look at this article: “Microsoft’s Extinction Event” written by Mike James over at i-programmer.info. New anti-Windows RT articles are appearing every day, and I agree in principle that Windows RT is a horrible step backwards for Microsoft, and a step away from its .NET strategy. It’s a classic case of “couldn’t leave well enough alone.” I have been calling for the end of Microsoft since April, 2009.

I have no doubt that Microsoft will survive for at least one more decade, but it will be in a greatly diminished form in terms of profits and market cap. The latest move with Windows RT and Windows 8 will cost them even more. It will cost them reputation, heavily. Instead of simply optimizing the performance of their already successful Windows 7 OS, they decided to diverge once again, trying to grab onto ARM market share at the expense of their traditional PC market share. The result will be catastrophic, as their PC clients will simply refuse to buy the new OS.

I really wonder how OEMs will handle Windows 8. Will Microsoft once again force OEMs to push an inferior OS like they did with Vista? It should be interesting to see how things evolve, but I suspect the strong demand for Windows 7 will, if MS is reasonable at all, cause Windows 8 to be held back for at least 2 more years. WinRT and .NET 4.5 are not small changes, and there are many things wrong with the way they are being deployed in Windows 8, such as the limited backward compatibility with .NET 4.0 or 3.5, which should raise eyebrows. They represent an unneeded “kick them while they’re down” moment for Microsoft’s developer community, and I for one will not stand for this.

Knowing that Microsoft is a company in decline whose market share will diminish over time is important for any developer, but especially for those who have grown up with Microsoft technologies and have never experienced much of anything else. As a developer, I have been diversifying into Node.js and other open source server-side technologies. On the client side, I’m placing my bets on HTML 5 and JavaScript as the dominant unifying force for the next few decades. SQLite will be a growing trend as well. I have had a 20-year technology vision since 2010. I have 12 years of experience now, going back to when I worked with the Win32 API and C++. These technologies will not die. COBOL never died. Neither will .NET, yet. But it is now seriously time to divest, away from Microsoft.

One last thing to think about, because it is 2012, is how Microsoft will support its older .NET frameworks or its older Windows versions, including all the variants of Windows Server that have been released just since 2006. How will they support Windows Azure and Windows RT and Windows Phone and Bing, all of which never existed prior to 2009? Think about what all of this means for a company whose bottom line will decline. Going out with a bang, perhaps? Only time will tell, but I’m hedging my bets, and so should you. Learn Linux, intimately.

RAIN 2102. Preview

It was the first annual directors’ meeting. Alfred was invited to come, for the first time in his career. He was very excited to see what the directors were doing, knowing full well that this was the innermost circle of elite in Upper Internet.

The building was a few miles outside Las Vegas, in a secret military base called M2.
The guest list was filled with important people in politics and business.
John Buffett, Warren Buffett’s great grandson, was there, along with the recently announced inventor of the telepathy implant, Michael Mitchell.
The first annual meeting was always an important event. Alfred was told there would be some “special guests” arriving.

Voice on Microphone: “Ladies and gentlemen, welcome to the first annual directors’ meeting for AD 2102. Our first guest speaker tonight will be none other than Michael Mitchell. If you haven’t had the honour …”

The introduction went on for 4 minutes. During this time, Alfred kept thinking about his project and how he was going to explain, at the board meeting next week, that his history simulator was set back by 6 months. Obviously, “my dog ate it” was not going to work.

“And now, without further ado, I give you Michael Mitchell!”

The crowd began to applaud loudly. As he looked around, Alfred began to feel a strange sensation, as if everyone was not quite natural. He couldn’t quite put his finger on it, but the applause did not quite sound natural. It seemed too fast, and too well entrained.

“Thank you, thank you.” began Mitchell. “It’s truly an honour and a privilege to be among you tonight.”
“As you all know, the telepathy implant is the kind of invention that enables us to move forward into the 22nd century.”
“Our contact with the Lyrans could not have been possible without this implant, as you all know.”
“This will soon enable us to decipher the meaning of every extraterrestrial language, and translate between any two languages, by applying the latest holographic translation software supplied to us kindly by Google.”
“But this is a vision that will still take a few years to implement. We, at Telepathica, are working on other projects as well, and we’d like to share one of these with you tonight.”
“To do this, I will invite on stage Jack Carpenter, research coordinator at Telepathica, who is probably very excited to show you what his team has been working on.”

Announcing “RAIN 2102”

THIS SUMMER

THIS SUMMER

THIS SUMMER

A novel about social media, programming, psychology, and the future is coming.

It is in the works now. Expected release date (for pre-ordering): September 15, 2012.

Preview:


It was raining again. Alfred walked out of the bar in disgust, little realizing that he had left his coat in his car. They were charging $150.00 for a beer, double what he had paid the week before. After six (or more) beers and no success, Alfred’s frustration finally spilled over in the form of complete outrage at something seemingly so trivial – the price of beer. He yelled: “Oh God! Why do you hate me so much?! WHY!? I can’t get lucky? Not even once? There’s NO WAY I’m paying $150.00 for a beer!” Alfred had quite a bit of money left, over $20 million in fact. He had saved it up by working long hours on clandestine projects for even more clandestine clients. His latest incarnation – the history simulator (or “Looking Glass” as he preferred to call it) was an ambitious attempt to re-create world history, in the most minute detail, in the form of a computer simulation that would display all of the variables in real time and allow the “visitor” (who would be able to enter this virtual world by using a consciousness-portal) to observe and alter history by “possessing” the mind of an unsuspecting subject. Of course, this being virtual, there were no real ethical concerns like the grandfather paradox. (Or were there?) Of course, there was always the question of what would happen when Alfred managed to successfully reproduce history with 100% (perfect) accuracy. Alfred was, of course, not about to let his client get his hands on the full unadulterated version of his “Looking Glass” project. In a peculiar way, Alfred had a conscience (yes, morals) and suspected that the project could be used for unimaginable evil. But perhaps he was just selfish. Or demonic. Either way, as Alfred came to grips with the cold rain hitting his ugly balding head, he realized that he had wasted all of his time and money, again. He began to run, slowly at first, then frantically, through the rain. Screaming, at the top of his lungs, something unintelligible, he ran and ran, and ran. Back home, Alfred collapsed onto the couch. The world began to spin. The phone began to beep. Beep-beep. Beep-beep. Beep-beep. …

Online Ordering Revamp

The ordering system is down at the moment. We are aware of this issue and we are working to fix it. If you’d like to place an order, please send an email to contact@dacris.com and someone will assist you.

We expect to have the new system up & running in a few weeks.