Building a Copenhagen Interpreter


Introduction: Building a Copenhagen Interpreter

About: I publish my failures and my successes, as my teachers have done before me. I am a member of Foulab, an independent, nonprofit research and engineering group in Montreal. Check out our webpage at

Disclaimer 1: The following represent preliminary results. I have yet to perform a proper statistical analysis. I myself am convinced that what I write here is purely speculative until I set up proper data collection and perform the requisite mathematics. Please refrain from telling anyone that anything presented here is "fact" until I can defend such a statement.

Disclaimer 2: This project is not safe for a wide variety of reasons. If you insist on attempting it, please observe adequate safety procedures. I am not qualified to recommend anything specific, so please perform extensive research. Your safety is your responsibility.

What I am describing here are my attempts to demonstrate the Copenhagen Interpretation of Quantum Mechanics. This does not prove other interpretations incorrect; it only shows that the Copenhagen Interpretation is useful for explaining the behavior of this device. For lack of a better name, I call this device a "Copenhagen Interpreter". Rather ironically given its name, if successful the device will produce nothing but provably unutterable nonsense.

The Copenhagen Interpretation was developed by Bohr and Heisenberg. Simply put (by wikipedia):

[It] rejects questions like "where was the particle before I measured its position" as meaningless. The measurement process randomly picks out exactly one of the many possibilities allowed for by the state's wave function.

What I will try to accomplish here is to build a "small" device that measures a system which may exist with some probability as a number of discrete states. Further, the state the system will exist in at time "N" cannot be predicted even given perfect knowledge of the system, infinite computing power, and infinite time. In other words, if the source can sustain maximum possible entropy, this device will be a nice demonstration of the Copenhagen Interpretation at work.

Being mortal, I don't have perfect knowledge of the system, infinite computing power or infinite time. What I do have is statistics, which is as close to any of these distasteful things as I care to get.

For fellow stats geeks, I'll be using P=0.01 throughout this experiment.

Step 1: Entropy Source

To begin we need a source of entropy. Not just any source... flipping coins, rolling dice, or observing the behavior of certain celebrities is not sufficiently random. Coin flips, dice rolls, and celebrity drug rehab are what we might call "large systems", which the Copenhagen Interpretation suggests are best described by classical physics.

The source of entropy for our purposes must be "small".

A source where quantum tunneling occurs seemed to me like a good place to start, as it only occurs on a small scale. As a consequence of the Heisenberg Uncertainty Principle, we cannot know for certain both where a particle is, and how fast it is moving (Note: this will be far from a complete treatment of the topic). As the measurement error decreases with respect to velocity, it increases with respect to position and vice versa. Therefore, if there exists a particle and a potential well, we cannot be certain the particle is within the well while being certain that it is traveling below the escape velocity for that well... it's all a matter of probability. (My reasoning may be incorrect here: please correct me if so)
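The uncertainty relation behind this argument can be written in its standard textbook form (my restatement for reference, not a derivation):

```latex
% Heisenberg uncertainty relation: position and momentum
% cannot both be known to arbitrary precision.
\Delta x \,\Delta p \;\ge\; \frac{\hbar}{2}
```

Confining a particle tightly within the well (small Δx) forces a large spread in momentum, so there is always some nonzero probability of the particle turning up with enough momentum to escape, or already on the far side of the barrier.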

Alpha decay is caused by quantum tunneling. If we have some macroscopic amount of an alpha emitter, measuring alpha decay meets our criteria for a "small source", since each count represents an individual atom decaying. Furthermore, the knowledge about the particles within an atom that would be useful in predicting alpha decay cannot be obtained, due to the nature of quantum tunneling and the Uncertainty Principle... So even given perfect knowledge of the system and infinite time, you could never do better than pure chance when trying to determine the time of the next measured decay.

So, we will use a 0.9 microcurie sample of Americium-241. It's readily available, legal to own in my area, and not likely to kill me.
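The statistics of such a source can be sketched in a few lines. Radioactive decay is a Poisson process: the intervals between decays are exponentially distributed and "memoryless", meaning that having already waited a while makes the next decay no more likely. The decay rate below is purely illustrative, not a measured property of my Am-241 sample.

```python
import random

random.seed(42)
rate = 500.0  # decays per second (illustrative, not measured)

# Draw 100,000 inter-decay intervals from the exponential distribution.
intervals = [random.expovariate(rate) for _ in range(100_000)]
mean_wait = sum(intervals) / len(intervals)

# Memorylessness check: among intervals that already exceeded 1 ms,
# the *additional* wait has the same mean as a fresh interval.
survivors = [t - 0.001 for t in intervals if t > 0.001]
mean_extra = sum(survivors) / len(survivors)

print(round(mean_wait * 1000, 2))   # ~2.0 ms, i.e. 1/rate
print(round(mean_extra * 1000, 2))  # also ~2.0 ms
```

This memorylessness is exactly why no amount of waiting or record-keeping improves your guess about when the next count will arrive.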

Step 2: Detector

The detector was a source of notable frustration. I didn't want to use a Geiger tube, as it would require high voltage equipment I don't have.

I had heard of solid-state detectors based upon the principle that a reverse-biased photodiode can detect incident particles. I decided to try it out, expecting it to work at the very least with the few 60 keV gamma rays the source produces.

I was in for a surprise: I couldn't get anything to work. In theory, incident particles would strike the depletion layer of the semiconductor, widened by the reverse bias, transferring enough energy to the silicon lattice to knock some electrons out of the outermost electron shell and into a new band (called the conduction band). The device would then allow a small current to pass for a very short time. I managed to get a lot of nothing out of it, even after connecting the signal (or lack thereof) to a 3-stage MOSFET amplifier.

In frustration, I disconnected the power. It then condescended to produce a signal...

What I think I was observing is a variation on the photoelectric effect. All PIN photodiodes act as small solar panels when exposed to natural light. Natural light has very little energy per photon compared to our alpha particles... which in our case are around 5 MeV. I had by chance removed the glass window on the photodiode to ease particle capture, and the system was sealed against light... so it seems consistent. An identical photodiode without the source produced only a signal of 60Hz interference.

In the photodiode with the source attached, there was what appeared to be random variation in the 60Hz interference. Furthermore, I estimated the frequency of these variations to be 15,000-50,000 counts per minute, consistent with the strength of my source.
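As a back-of-the-envelope sanity check (my own arithmetic, not the author's measurement pipeline), we can ask whether 15,000-50,000 counts per minute is plausible for a 0.9 microcurie source, given that a bare diode only intercepts a small solid angle:

```python
# 1 curie = 3.7e10 decays per second, by definition.
CI_TO_BQ = 3.7e10

activity_bq = 0.9e-6 * CI_TO_BQ   # ~33,300 decays/second
dpm = activity_bq * 60            # ~2 million decays/minute

# Implied geometric/detection efficiency for the observed count rates:
eff_low = 15_000 / dpm
eff_high = 50_000 / dpm

print(f"{dpm:.3g} decays/min")
print(f"{eff_low:.1%} to {eff_high:.1%} efficiency")  # roughly 0.8% to 2.5%
```

An efficiency of a percent or two is quite believable for a small unfocused sensor near the source, so the estimate and the source strength are at least mutually consistent.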

Finally, upon careful repeated observation, a voltage peak would occasionally appear at roughly double the usual size... which is consistent with random coincident events rather than a regular, nonrandom waveform.

I hesitate to make any decisive conclusions, but I may be detecting alpha particles basically using a glorified solar panel. As far as I know, this technique is undocumented. If it does represent a novel technique, I explicitly release this detection technique under the GPL.

The first photo is of a single "event" measured at 200 microseconds per centimeter. The second photo is on a much longer time scale... 1 ms/cm if I remember right. There is unfortunately quite a bit of noise, but you can still see some very stochastic deviations in it. One of the events is quite a bit bigger than the rest, and may indicate the coincidence of two events. It took some effort to find such a thing, as on longer timescales the events are hard to see. The giant spikes are artifacts of my aging but indispensable oscilloscope.

I should restate that these results are very preliminary. I still need to run the experiment with better shielding to eliminate the noise. Then, I'll need to amplify the signal, automate data collection with a microcontroller... and finally test the probability of true randomness... or determine information density, if you prefer such terms.

I have an idea or two about practical uses of this device if I can confirm this effect.

Step 3: Additional Info

Here's a small comic which demonstrates how the detector was originally supposed to work. I have no illusions of being good, or even acceptable, at drawing.

What actually happened was not a sudden change in the conductivity of the photodiode silicon lattice, but a short-lived buildup of charge as demonstrated by the oscilloscope photographs, akin to the photoelectric effect.

Reverse-bias is *supposed* to improve event capture by expanding the depletion layer, and cause the charge buildup to be shorter lived (but still visible), improving performance... but I could not conspire to get such a thing to work.

If the correct components find me, I will try to repeat this experiment with them in an attempt to cheaply increase sensor area, as well as render the experiment in its entirety more easily reproducible... the current photodiode is a Hamamatsu S1223-01. I have some left over from a failed experiment involving single-photon detection. A very nice product, however rather expensive for a hobbyist.

Step 4: New Developments

None of my projects is ever truly "completed". This section will describe new developments as they occur.

The only thing I have to report so far is that I borrowed a 1L steel paint can from my workplace (we give them away anyway), and used it as shielding from electromagnetic interference. I hardly had to do anything more than shove the detector/source inside, and use the can as the external oscilloscope ground.

The results were immediate and drastic. The 60Hz interference, as well as the (mysterious) ~200 kHz interference I always pick up, are apparently completely eliminated by this trivial modification. There are nice voltage spikes visible, and they look exactly as they should.

I find it difficult to believe that this is actually working, but it seems fortune has favored me today. My next step is to collect a lot of data and perform statistics which I will publish as a separate instructable. Finally, I'll get this working with components that are more easily obtainable.

Our first image here is the entropy source and modified photodiode at a "medium" sampling rate, encased in the new EMI shielding. The extremely rapid oscillations present in two parts are artifacts of my quirky oscilloscope. The voltage spikes followed by the apparent exponential decays are our quantum incidents.

Our second photo is similar, but at a higher sampling rate showing the individual incidents at a better resolution. There just happen to be 4 in a row in close proximity... this was not commonly observed, but made for a cute photo.

Our final photo is a pseudocontrol. It is an unmodified photodiode encased in the same shielding, with no entropy source. The scope settings were preserved as compared to the previous two photos. As you can see, there is a lot of nothing happening!

I say "pseudocontrol" since it is an unmodified photodiode. The modification consisted of removing the glass window protecting the semiconductor... and I do not immediately feel like either disassembling my working detector or getting covered in ground glass again carefully removing the window.

August 5th: I have ordered a USB oscilloscope with 60 MHz bandwidth... not only does this outclass my previous scope (which I will miss), it allows for automated data collection, stored as a text file. I have a SAS UNIX license for academic use; statistical analysis will follow.

Sept. 11th: It seems that my new scope falls short in one area. While it has higher bandwidth and more features than I can shake a stick at, it does not seem to pick up weak signals (under 10mV) as well as my old scope. I will need to build a MOSFET amplifier before I'll be able to see anything interesting. Unfortunately, I've committed to several other projects which I must see through before I'll have time to work on this again.

November 25th: I picked up a new ancient scope with better sensitivity and 5 MHz bandwidth. I can once again observe the effect, and while this scope has no digital storage function, it should be sufficient to resume work on this project. On this scope, the exponential voltage decay curves caused by particle impact are more apparent, due to the higher bandwidth I assume (5 MHz instead of 2 MHz)... no more jagged lines!

Step 5: Things to Come...

November 28th (and some of the early hours of the 29th): Based on the data from the 5 MHz scope using a new, larger Faraday cage, I designed an open-loop (130 dB gain) differential amplifier circuit for use with the device. The whole thing has to run within the Faraday cage or noise will render the circuit inoperable. Everything worked on the first try (which is quite abnormal), and now I can see voltage spikes for each particle impact. They are a sudden increase of 20 mV over around 2 microseconds, followed by a clean exponential decay over around one millisecond.
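For readers unused to decibel figures, the quoted 130 dB converts to a very large linear voltage ratio. This is just the unit conversion for the amplifier spec, not a claim about the realized gain of the circuit, which depends on bandwidth and loading:

```python
import math

# Decibels to linear voltage gain: dB = 20 * log10(Vout / Vin).
gain_db = 130.0
gain_linear = 10 ** (gain_db / 20)   # ~3.16 million

# Round-trip sanity check back to decibels.
back_to_db = 20 * math.log10(gain_linear)

print(f"{gain_linear:.3g}")  # ~3.16e+06
print(back_to_db)            # 130.0
```

With a voltage gain in the millions, it is no surprise that the whole circuit has to live inside the Faraday cage: any stray pickup on the input gets the same multiplication as the signal.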

The results of this experiment, as well as the goal of all of this work will be published in a new instructable within the next few months.

For now, I leave you with the readout from a Hantek DSO-2150 USB scope (150 megasamples/sec), showing a signal amplified by a differential amplifier based on the OPA2132... It shows the impact of 1 or 2 helium nuclei at ~5 MeV, overlapping. Note how they produce peaks of the same amplitude, suggesting two particles of similar energy hitting the detector. It is also (remotely) possible that one particle had an elastic collision and then hit another silicon atom in the detector a second time within 20 microseconds, or that there was some sort of secondary emission, but we would expect such events to produce peaks of differing amplitude, which makes them unlikely here given how similar the two peaks are.

The particle detector meets all design goals. It detects the incidence of alpha radiation with excellent resolution and low noise. The next phase of the project is data acquisition, statistics, and the construction and programming of a microcontroller that interprets this data in such a way that a computer can make use of it. This will be called the "Quantum Entropy Accumulator", since... well... it accumulates entropy from a quantum source. Forgive my lack of originality, I'm tired.




    29 Discussions

    I once had to build a low-pass filter and pre-amplifier for an instrumentation project. I hung my device onto the A/D card and got a mess of high-frequency noise. I thought that the coax cable from instrument to pre-amp would be noise free, but I was wrong. When I hung the pre-amp/filter off the instrument and sent the amplified signal over the coax, the noise was negligible. The take-home, I guess, was to pay attention to signal paths. Besides shielding the pulse-generating circuitry in a cage, it sounds like paying attention to all signal paths and sources of noise may help. Are you using bypass capacitors on the V+ pins as well as being generous with ground planes? I'm not an EE, but I know there are guides on the websites of providers of instrumentation ICs, like Texas Inst., Analog Devs., and so forth. Good luck.

    1 reply

    I remember using 220uF and 0.1uF power supply bypass capacitors. The ground plane for the open-loop amplifier stage is quite generous, and miniaturized as far as the components would allow. In later revisions, the preamplifier, diode, and particle source were made into a module about the size of a US dime and encased in grounded copper foil. This gave the best performance, and looked doubleplus cool.

    I return to work on this project from time to time, the most frustrating part is the cost of the diodes... 25$ each, and you get a scary phone call at home asking why you need milspec parts (basically, just answer honestly). I've tried different exposed die semiconductors as replacements (MOSFETs, power transistors, diodes) without success. I still have a few left to try based on older "budget" particle detector designs documented by some universities (updated with modern tech of course).

    Really, I'm surprised that an exposed power transistor didn't work when I tried it. It should work; I must be doing something wrong.


    A couple of suggestions for reducing noise: Use an active filter to remove the 60Hz band from your signal. It looks like the pulses of interest are much shorter than the 16.67 ms period of 60Hz, so a high-pass or notch analog active filter could be included in the amplifier chain. Also, does the avalanche diode detector become quieter at lower temperatures? If so, could it be cooled by, e.g., a Peltier device?

    1 reply

    Well, you're right! In other designs I used a twin-T notch filter for this task (active or passive). Due to the high amplification, though, other sources of noise can be problematic too (serial comms, nearby switching power supplies)... so when I designed this some time ago, I decided that a Faraday cage was the best option. Another (smaller) problem is that signal frequency and pulse duration can vary quite a bit (multiple hits at the same time, maybe also Bremsstrahlung), though it's still nowhere near 60Hz. I think pulse duration is a bimodal distribution (2 normal curves), and frequency is a Poisson distribution... so it involves some interesting math.
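The twin-T topology mentioned above has a simple design equation: the notch sits at f0 = 1/(2πRC), with the second tee built from R/2 and 2C. The component values below are my own illustration of targeting 60 Hz mains hum, not the values used in this project:

```python
import math

# Passive twin-T notch filter design: f0 = 1 / (2 * pi * R * C).
# The shunt tee uses R/2 and 2C; values here are illustrative only.
C = 0.47e-6   # farads (a common film-capacitor value)
f0 = 60.0     # hertz, the mains interference to reject

R = 1 / (2 * math.pi * f0 * C)   # ~5.6 kilohms

print(round(R), "ohms")
```

In practice one would pick the nearest standard resistor value (5.6 kΩ) and trim for depth, since the notch depth of a passive twin-T is very sensitive to component matching.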

    Avalanche photodiodes are expensive, so I didn't use one... but yes, you can cool them to decrease noise, and (if you cool them enough) even use them as single photon detectors. This is actually a normal (but large surface area, small capacitance, and milspec) PIN photodiode.

    That's where I was three weeks ago, my friend. Then, a colleague told me that "no one can really conceptualize quantum physics". I firmly believe that with due diligence, any aspect of reality can be understood by just about any human being. This project was ultimately inevitable. Mind you, this is still all rather confusing to me too. I have not proven my colleague wrong just yet. Wish me luck ;)

    Here's where I can't buy it: Schrödinger's cat. Put a cat in a box with a quantum particle (which you can't be certain about) that is a trigger to a device that will kill the cat if it decays, and close them up in such a way that the cat is unable to affect the quantum particle.

    The Copenhagen interpretation says that since you don't know the state of the particle until you observe it, it exists in both states, decayed and not decayed. Since that particle is the trigger to the cat's death, the cat is both dead and alive at the same time. It exists in both states until we observe it. The act of observing the cat, and therefore the particle, forces it/them into a single state, and the cat will only remember that state (if it's alive to remember it).

    I don't buy it. The cat is either dead or alive. Not both. Can't be that way. The Wikipedia article on Schrödinger's cat contains an excerpt from a letter from Einstein. He agrees. Reality gets in the way of the experimental situation.

    When this experiment was tried, the cat said, "Stuff this!" and hoofed it through the nearest window.

    Schrodinger's thought experiment is meant to point out the paradoxes in using the Copenhagen Interpretation to explain macroscopic phenomena.

    Here's a non-quantum example: Assume coin flips are random. You have foolishly bet your friend that if a particular coin flip lands heads, you will give him $2, and if it lands tails, you gain $1 from him. As soon as the coin is tossed, you have on average lost fifty cents: (0.5)*(-2)+(0.5)*(+1) = -0.5, even though it is impossible for you to actually have lost this amount, because when the coin lands it is either heads or tails.
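The coin-bet arithmetic is easy to verify by simulation: lose $2 on heads, win $1 on tails, so the expectation is (0.5)(−2) + (0.5)(+1) = −0.5. A quick sketch:

```python
import random

random.seed(0)
flips = 100_000

# -2 dollars on heads, +1 dollar on tails, fair coin.
total = sum(-2 if random.random() < 0.5 else 1 for _ in range(flips))
average = total / flips

print(round(average, 2))  # close to -0.5, yet no single flip ever loses $0.50
```

Which is the point of the example: the expectation is a property of the ensemble, never the outcome of any individual trial.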

    My assertion is that Schrodinger's cat is not *actually* both alive and dead, just as the bottle of cyanide is not actually both broken and unbroken. However, because we are dealing with probability theory, and both states of the cat are equally likely, (0.5)*(living)+(0.5)*(dead)=equally living and dead. Both of these are examples of how probabilistic models of reality do not ever correspond to the result of any given real incident.

    A real life example: I study trees. I have measured the diameter of many trees, and discovered that the arithmetic mean value is some number, say 23 centimeters. This is meaningless on its own until I also tell you that the 95% confidence interval for that estimate is from say... 22.5 to 23.5 centimeters. We "know" that there is in fact a "real" mean value... but not only is it not necessary for any actual tree to have that diameter... but the real mean value is (sometimes equally) likely to be either greater or less than our estimated value.

    Similarly, if we were (in rather poor taste, I think) to run Schrodinger's thought experiment in real life ten thousand times... we would have approximately 5000 live and 5000 dead cats. At no point in reality is any cat really alive and dead, except on paper...

    Probabilistic models do not describe what the status of any actual event will be while it is happening, only the average final outcome of the event if it is run many times. The Copenhagen Interpretation is interesting because it is a demonstration of a system in which the *only* models that predict behavior are probabilistic ones. Furthermore, it may suggest that the only models that predict anything are probabilistic, and that the "reliability" of classical physics is only a result of the many combined probabilistic events in macroscopic systems. As a side note, you then have Chaos Theory which asserts that extremely large, complex systems are also probabilistic... a beautiful and inconvenient symmetry, don't you think?

    ...And there you have my Copenhagen Interpretation Interpretation.

    Sweet flaming bagels, is there no way to edit comments? The first sentence should read: Schrodinger's thought experiment is meant to point out the paradoxes in using Quantum Mechanics to explain macroscopic phenomena. The Copenhagen Interpretation attempts to resolve this and other paradoxes.

    Well, do you ever worry that your cat has died while you were out? It's confusing, but it makes sense in terms of us... but what about things we observe without understanding/interpreting? It makes more sense in a mental way than a physical one...

    OK, well I'm too lazy to post the newer developments here, so just go read the talk @ DEF CON 17.

    Thanks for this. I'm a lapsed physicist, and I really enjoy things like this that don't require me to spend weeks relearning calculus. Looking forward to your statistical analysis.

    5 replies

    Wait, you don't really need calculus in the real world?? Cool! (maybe it's just one of those GPA things)

    Calculus+statistics=winning at computer games.

    Or at investing, if you live in the past and don't consider it just another computer game.

    lol. I'm not that much of a math fan to go so far as to calculate the odds of me beating others (or the computer). I'm a fan of strategy games (Command & Conquer) & FPS games. I'm just glad that it's not imperative for doing engineering.

    I finished the stats analysis. I produced 100 megabits of data by sampling this system with a microcontroller running a special case of a von Neumann whitening algorithm. This took a rather long time, as you might imagine.

    I tested the output in Linux with the NIST test suite.

    The results were pretty good. Random data has a predictable failure rate for tests for non-randomness (as strange as that sounds). The observed failure rate was not significantly different than the expected rate.
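For readers unfamiliar with the whitening step mentioned above, here is a minimal sketch of the general von Neumann technique (the project's microcontroller version is described only as a "special case" of it, so details may differ). Raw bits are taken in non-overlapping pairs: 01 and 10 each yield one output bit, while 00 and 11 are discarded. The output is unbiased even if the source is biased, provided successive bits are independent:

```python
import random

def von_neumann(bits):
    """Debias a bit stream by pairing: 01 -> 0, 10 -> 1, 00/11 discarded."""
    out = []
    for i in range(0, len(bits) - 1, 2):
        a, b = bits[i], bits[i + 1]
        if a != b:          # mixed pair: keep one unbiased bit
            out.append(a)   # (1,0) -> 1, (0,1) -> 0
        # (0,0) and (1,1) carry the bias, so they are thrown away
    return out

# A heavily biased source still yields ~50/50 output:
random.seed(1)
raw = [1 if random.random() < 0.8 else 0 for _ in range(100_000)]
clean = von_neumann(raw)
print(sum(clean) / len(clean))  # close to 0.5 despite ~80% ones in the input
```

The cost is throughput: for a source with ones-probability p, only 2p(1−p) of the input pairs survive, which is one reason generating 100 megabits took "a rather long time".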

    Thanks! My new oscilloscope does not seem to have high enough gain on the input amplifier to observe the effect as on my old scope... so it looks like I'll have to wait until I have time to build an external amplifier using some MOSFETs.

    So perhaps a use for this would be to generate unpredictable random numbers to seed computer encryption algorithms?

    Geez... you mathematicians are in a world all of your own. When I read this instructable, not only did my wave function collapse, but my BRAIN collapsed along with it!