What I Learned Today #1 - Sacrifices Have To Be Made

in science •  7 years ago  (edited)

As a chemistry student, I spend a lot of time studying at the moment, as exam season is coming up very soon. However, I am in the lucky position of having only two exams to write this semester. That leaves me with the opportunity to share some of my newly gained wisdom. Therefore, with this post, I am starting a series called "What I Learned Today", in which I will post an article about what I have learned today (obviously) and hopefully give you an insight into what (physical) chemists, or scientists in general, do all day.


Sacrifices Have To Be Made

Science is all about predicting the outcome of an experiment. Whether I throw a ball and want to know where it lands (physics), mix two substances and hope that the result can cure cancer (chemistry), or want to find out whether a Great Dane and a Chihuahua can mate (biology). Side note: Yes, chemistry is the greatest of all :)

In my case, I am currently studying what I have to think about when designing a spectroscopic experiment. Spectroscopy is all about irradiating a sample of atoms or molecules with electromagnetic waves (e.g. light) in order to investigate their characteristics. To do that, I want to describe the system consisting of my spectrometer and the sample, and their behaviour, using linear response theory. What this means exactly is not important right now; it simply enables me to generalise the system I investigate and to make predictions about future outcomes of experiments.

However, to be able to apply linear response theory, my system has to be time invariant (measuring today will yield the same output as measuring tomorrow), homogeneous (irradiating with twice the light intensity will yield twice the output signal) and additive (irradiating with two light sources simultaneously will yield the same output as the sum of the outputs obtained by irradiating with each light source on its own).
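To make the last two conditions concrete, here is a minimal sketch in Python. Everything in it is illustrative and not from any real spectrometer: a toy detector whose output saturates at high input intensity satisfies homogeneity and additivity for small inputs, but violates both badly once it saturates.

```python
import numpy as np

def response(intensity):
    """Toy detector (hypothetical): nearly linear for small inputs,
    saturating toward 1.0 for large ones."""
    return 1.0 - np.exp(-intensity)

x1, x2 = 0.001, 0.002  # small intensities, well inside the linear regime

# Homogeneity: doubling the input (approximately) doubles the output.
assert np.isclose(response(2 * x1), 2 * response(x1), rtol=1e-2)

# Additivity: both sources together equal the sum of each source alone.
assert np.isclose(response(x1 + x2), response(x1) + response(x2), rtol=1e-2)

# Both conditions fail for large intensities, where the detector saturates:
big = 10.0
print(response(2 * big), 2 * response(big))  # ~1.0 vs ~2.0 -- not linear
```

The exponential saturation is just a stand-in for "not enough molecules to absorb more light"; any curve that flattens out at high intensity would make the same point.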

Time invariance can be achieved by controlling parameters, such as temperature, very precisely and is not a concern in this case. The other two, homogeneity and additivity, are more problematic. Take homogeneity, for example: irradiating a sample with a light source that is a billion times more intense than another will probably not yield an output that is a billion times more intense. This is simply because there may not be enough molecules around to interact with that much more light, so a lot of it passes through the sample without ever having the chance to interact with a molecule.
You can compare this with the experiment of throwing darts at a wall with 100 balloons hanging from it while blindfolded. If you throw 10 darts, you will probably hit around 8 balloons. Two of your darts will not hit a balloon, because you throw them at a spot where the balloon has already been popped by a previous dart. This leaves you with a hit chance of 80%. However, if you now throw 1000 darts at the wall with 100 balloons, you simply cannot reach a hit chance of 80%, as there are not even 800 balloons hanging from that wall.

Okay, so do we "throw the shotgun into the grain" (an intentionally literal translation of a German saying for giving up)? Of course not! We just have to find the range of light intensities where the correlation between input and output is linear (twice the input -> twice the output) to a good approximation. Going back to our dart-balloon analogy, we have to find a range where the 80% rule still makes sense. This means that we limit our experiment by limiting the number of darts we hand out. We do not want to hand out only one dart, as the hit chance is then probably going to be 100%, and we do not want to hand out 100 darts, as the hit chance will then probably be lower than 80%. What we can do is run an experiment: hand out every number of darts between 1 and 100, throw them, and see in which range the hit chance is approximately 80%.

We can therefore set a range of light intensities for which we can assume homogeneity and, in a similar fashion, additivity in our experiment, and thus use linear response theory. This means that we may have limited the number of use cases of our experiment, but we are at least able to make predictions.

What can we learn from this?

First of all, we learn that as scientists, we always have to make sure we understand the limitations of our equations and methods, and do not try to use them in cases in which they simply do not apply.

Furthermore, we can also apply this principle in our everyday life. Finding the best solution to a problem may involve making sacrifices. It is sometimes simply not possible to make everyone happy. Of course, optimising is great, but sometimes the costs are just higher than the gains. Writing a textbook that covers every single experiment there is, its strange behaviour in extreme cases and every effect at play in it, and then forcing every student to learn it all by heart, is far more work than giving students general ideas and tools for tackling experiments, telling them that those tools have limitations, and teaching them how to overcome those limitations if necessary.


Have you not had enough science today, but are tired of doing it yourself? Try out Gridcoin and earn some money!

Gridcoin is a cryptocurrency based on the BOINC network. Instead of producing hot air by calculating meaningless hashes, your computer works on real scientific workloads and you get rewarded for it. You don't have the newest GPU farm at home? No problem! You can crunch with your CPU or your GPU and make a decent amount of GRC even with older hardware. Further information can be found here or here, and there is even a pool that is pretty much free of charge here.
