Real Steampunk Computer Brought Back To Life 81

New submitter engineerguy writes: We discovered a 100 year old 19th century computer that does Fourier analysis with just gears, springs, and levers. It was locked in a glass case at the University of Illinois Department of Mathematics. We rebuilt a small part of the machine and then, for two years, thoroughly photographed and filmed every part of the machine and its operation. The results of this labor of love are a video series (a short documentary), which is 22 minutes long and contains stunning footage of the machine in action, including detailed descriptions of how it operates. The photos are collected in a free book (PDF). The computer was designed by Albert Michelson, who was famous for the Michelson-Morley experiment; he was also the first American to win a Nobel Prize in physics.
This discussion has been archived. No new comments can be posted.

  • 1914 is not the 19th century. I imagine this person still uses 'turn of the century' to refer to the 1900s, too. In a similar vein, for an actual 19th century computer there is Babbage's Difference Engine [youtube.com] (tighter shots here [computerhistory.org]), which is very impressive to watch as well.
    • Re:100 Year old (Score:5, Informative)

      by ChrisSlicks ( 2727947 ) on Saturday November 15, 2014 @05:07PM (#48393609)
      The machine was designed in the late 19th century (1897) and a working prototype was built. This particular machine was from 1914.
      • Correction, they don't know exactly when it was built. Likely somewhere between 1901 and 1910. So an early 20th century machine based on a 19th century design.
        • Great, you guys.... The post is a wonderfully pedantic argument about dating (things, not people; we don't worry about the latter around here). Nothing about the actual substance of the post (which is pretty cool, and beats Bennett Haselton posts any day).

          I think the Asperger's pheromone is strong today. Lighten up. At least say "Cool, but ...."

          Group hug time?

            • Was just pointing out that it really is a 19th century machine (design) as stated, but yes, the specific date is irrelevant. And yes, it is cool, and I watched all the videos. Mind blowing that he was doing mechanical Fourier analysis at the time. It was a great period, when several mathematical greats were also great engineers.
            • Re: (Score:2, Interesting)

              by mikael ( 484 )

              Fourier analysis was first developed in the 1800s. It took 80 years for the first programmable mechanical hardware to appear, in the form of weaving looms in the 1880s. Then the development of mechanical analysis systems like this happened another 20 or 30 years later. Another 70 years, and we can play music on our home PCs and see funky animated digital audio equalizers.

              http://en.wikipedia.org/wiki/J... [wikipedia.org]

              • by Teancum ( 67324 )

                Far more relevant in the 1880s is the United States Census for 1880 that took over 12 years to compile. The U.S. Census Bureau realized they would continue to fall behind unless they made some substantive changes to how they compiled the statistics which Congress insisted upon, not to mention plotting out the data needed for making district maps for Congress as required by the Constitution.

                That is how you got Herman Hollerith who made the punch card through a system that census workers would input data abo

          • "The post is a wonderfully pedantic argument about dating (things, not people,"

            Of course! I don't think any of them are likely to have any experience dating people!

          • beats Bennett Haselton posts any day

            You're kidding, right? He's a fucking regular contributor, show some respect.

    • 1914 is not the 19th century....

      If I drive a classic 1967-69 muscle car entirely rebuilt last month from new materials, I suppose you'll say I'm driving a 21st century automobile. I could be wrong, but I think all the computers sold commercially today and for the foreseeable future are in fact 20th century computers, regardless of the date of manufacture. A computer built in 1914 is not necessarily a 20th century computer, and your point is, in fact, irrelevant. But the links are cool, thx.

    • 1914 is not the 19th century.

      That being rather obvious you ought to have stopped and asked yourself the mandatory question "What is it that I'm not getting?"

      Then RTFM which starts with the words "This book celebrates a harmonic analyzer designed in the late nineteenth century by the physicist Albert Michelson," his progress is described below:

      [Michelson] first built a 20-element analyzer, one that calculates with 20 sinusoids with radian frequencies starting at 1, the fundamental, followed by the harmonics 2, 3, and so on up to 20. He found the “results obtained were so encouraging that it was decided to apply to the Bache Fund for assistance in building the present machine of eighty elements.” His application succeeded: he got $400.00. With those funds he built a harmonic analyzer with 80 elements, which he described in detail in an article published in The American Journal of Science [in 1898].
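
What those 20 (or 80) elements do mechanically is just a weighted sum of cosines. A minimal sketch in Python; the function name and the unit amplitudes are my own illustration, not from the book:

```python
import math

def synthesize(coeffs, x):
    """Sum harmonics k = 1..len(coeffs), each scaled by its amplitude,
    the way the analyzer's gear-driven sinusoid elements add up."""
    return sum(a * math.cos(k * x) for k, a in enumerate(coeffs, start=1))

# 20 elements with unit amplitudes: at x = 0 every element is in phase,
# so the pen deflection is simply the element count
coeffs = [1.0] * 20
print(synthesize(coeffs, 0.0))  # 20.0
```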

  • by houstonbofh ( 602064 ) on Saturday November 15, 2014 @04:44PM (#48393491)
    Now OpenBSD is going to need to buy more old hardware to support builds...
  • We know. It was on Hacker News days ago.

    When the guy publishes the videos of how to use it for Fourier analysis, that will be interesting. It's obvious how synthesis works, but not how the reverse operation works.
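
For what it's worth, the reverse operation can be viewed as the same summation run with different inputs: the k-th coefficient of a sampled function is itself a weighted sum of cosines, so the same gear train can evaluate it. A rough numerical sketch (my own illustration of the math, not the machine's actual operating procedure):

```python
import math

def analyze(samples, k):
    """Estimate the k-th cosine coefficient of a function from N equally
    spaced samples over one period: a weighted sum of cosines again."""
    n = len(samples)
    return (2.0 / n) * sum(
        f * math.cos(2 * math.pi * k * i / n) for i, f in enumerate(samples)
    )

# Sample f(x) = 3*cos(2x) over one period and recover harmonic 2
n = 64
samples = [3 * math.cos(2 * (2 * math.pi * i / n)) for i in range(n)]
print(round(analyze(samples, 2), 6))  # 3.0
```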

  • "Computer" (Score:5, Interesting)

    by vikingpower ( 768921 ) on Saturday November 15, 2014 @05:26PM (#48393711) Homepage Journal

    "Computer", actually, has the meaning: "Machine that performs computations". In that sense, this contraption truly is a computer. It probably only has a memory size of only a few bytes, in modern terms, and can only do a few FLopS also. Yet, it is a computer, in all senses of the word.

    Funny. I always thought of Michelson as one of the two guys involved in the "failed" mirror experiments that allowed A. Einstein to come up with the theory of Special Relativity. Not so, it turns out now: the guy was an accomplished engineer. How great.

    • Re:"Computer" (Score:5, Interesting)

      by calidoscope ( 312571 ) on Saturday November 15, 2014 @05:34PM (#48393767)
      Michelson did a lot of work on measuring the speed of light, one of the last measurements he did involved a mile long vacuum chamber. As with many experimental physicists, he had to be an accomplished engineer as well in order to conduct his experiments.
      • by jnork ( 1307843 )

        Higgledy Piggledy
        Albert A. Michelson
        Did his experiment,
        Came away miffed;
        "Need a more accurate
        Interferometer.
        Back to the drawing board;
        Can't get the drift."

    • Back when this machine was made, "computer" actually had the meaning "person that performs computations".
      • by Teancum ( 67324 )

        While partially true, there were a great many mechanical analog computers which did a great many things and were widespread in the early 20th Century... including when this particular machine was made.

        A good video that shows how some of those mechanical computers were made can be found in this U.S. Navy training film:

        https://www.youtube.com/watch?v=s1i-dnAH9Y4 [youtube.com]

        Computers like this were used as early as the Spanish-American War and the Crimean War. A much older computer was found in the form of the Antikyther [wikipedia.org]

      • I checked that in vol. 3 of the 20-volume Oxford English Dictionary, my proudest material possession. You are right. Up to at least the 1850s, as supported by the extensive corpus of citations in the OED, "computer" meant "a person performing computations". The first solidly documented occurrence of the word as "machine performing computations" is from 1897; from 1915 on, the word is only found in this sense, i.e. the sense of "person performing computations" has then fully disappeared, in a period of only

    • by bjs555 ( 889176 )

      Actually, I've read that Einstein probably wasn't aware of the Michelson-Morley experiment. His reason for rejecting the existence of the ether was based on a thought experiment as mentioned in many non-technical books on relativity. However, nearly all of these books fail to mention what the thought experiment was. I finally found one explanation of it in the book "The Big Bang" by Simon Singh. According to him, Einstein's thought experiment is such:

      Get into a vehicle traveling at constant velocity through

    • by casings ( 257363 )

      While I don't disagree that this should be classified as a "computer," it is obvious from your comment that you didn't watch the video.

    • Yet, it is a computer, in all senses of the word.

      Except the sense of the word most commonly used today: it wasn't Turing complete. So, it's not a computer.

      • A computer does not necessarily have to be Turing complete. There is no universally valid, formal definition of "machine performing computations" that also requires Turing completeness; at least, none that I know of.
        • ok, so you are ignorant of the common meaning of words? What do you want me to say?

          Before 1948 a computer was a person, someone who computed (often in an assembly line with other people).
    • by k6mfw ( 1182893 )

      Funny. I always thought of Michelson as of one of the two guys involved in the "failed" mirror experiments that allowed A. Einstein to come up with the theory of Special Relativity.

      What also impresses me is that he and Morley were wondering how fast Earth was moving through space back in the days of cowboys and Indians. Because their mirror setup kept producing the same c, they went on to build more elaborate setups (which were ever more complex engineering feats).

  • Mind blown (Score:5, Insightful)

    by Puff_Of_Hot_Air ( 995689 ) on Saturday November 15, 2014 @06:05PM (#48393937)
    There are times when I do things that I think are pretty smart, and then I see something like this and am humbled. It staggers the imagination to envisage how this Albert fellow was able to design this incredible machine. It's marvellous to watch, and beautiful in its operation. This is how Fourier analysis should be taught! Nothing has brought it more alive for me than watching this documentary. I desperately want one; I don't think I've ever seen a machine more beautiful.
    • Re:Mind blown (Score:5, Interesting)

      by hey! ( 33014 ) on Saturday November 15, 2014 @11:04PM (#48395029) Homepage Journal

      I was in the last cadre of high school students to learn the slide rule. I did trig and math problems on a Pickett N800 [sliderulemuseum.com], although later I preferred a circular Scientific Instruments 300B [sliderulemuseum.com].

      The idea of building a machine to perform mechanical analog computation is not so outside the box for anyone who's ever done analog computation by hand. A repetitive series of calculations boils down to a repetitive sequence of movements, and in particular, if you used a circular slide rule, the idea of some kind of gear train to do the calculation would have been obvious.

      Which is not to say the devices weren't ingenious. But except for the abacus and the adding machine, analog contraptions were the only way to do computation other than by handwriting.
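
The slide-rule trick itself is just addition of logarithms: sliding the scales adds lengths proportional to log(a) and log(b), and the product is read off the log scale. A toy sketch, with a made-up function name:

```python
import math

def slide_rule_multiply(a, b):
    """Multiply the way a slide rule does: add the logarithms of the
    two factors, then convert the summed length back off the log scale."""
    return math.exp(math.log(a) + math.log(b))

print(round(slide_rule_multiply(2.0, 3.5), 6))  # 7.0
```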

      • Re: (Score:3, Interesting)

        by bjs555 ( 889176 )

        Playing with a slide rule is like saying after sex, "I haven't had this much fun since I first encountered logarithms." In my case, I'd have to sadly admit that I've had more fun quantity wise with logarithms. Seriously, though, I too was in school at the time of the slide rule's demise. They were interesting to use. I recall using electronic analog computers at about the same time. They consisted of a patch board and a number of op amp differentiators, integrators, and gain blocks. You could use the patch

    • I agree with your comments wholeheartedly. This was simply amazing, and took an amazing mind to design and build.
  • by caseih ( 160668 ) on Saturday November 15, 2014 @06:18PM (#48394007)

    Wow, that was an amazing set of videos. Particularly how the machine can do decomposition. What a brilliant man who designed this machine.

    All analog computers fascinate me. Apparently analog computers implemented fire control on navy ships for many years, compensating for the speed, direction, and roll of the ship in order to aim guns. The accuracy of such a system was impressive, and they were used up until the 1980s on some older ships. Digital systems simply couldn't get the accuracy for many years.

    https://www.youtube.com/watch?... [youtube.com]

    Slide rules are very cool as well. I want to learn how to use one.

    • by techno-vampire ( 666512 ) on Saturday November 15, 2014 @09:00PM (#48394577) Homepage
      Back when I was in the Tonkin Gulf Yacht Club in '72, our ship carried a 5"/54 gun, which was aimed using a mechanical analog computer. I know that the Iowa Class Battleships all used mechanical fire control both because it was more than accurate enough for the job and because it was specifically designed to ignore the shocks caused by firing the main battery, as well as the bigger shocks caused by incoming shells, bombs and torpedoes.
      • Yeah, I was in that Yacht Club the exact same year. Our ship, the Goldsborough, had two of those guns aimed by mechanical computers; it was decommissioned in 1993, the last of its class in the US Navy, and I'm reasonably certain that her mechanical computers were not replaced with electronic ones (although certainly electronic computers were installed for other purposes). So mechanical analog computers were used until at least then in the US Navy. Several other ships of its class were still in commission

    • Digital systems simply couldn't get the accuracy for many years.

      That makes no sense. While analog computers have inherent accuracy limitations, digital computers provide arbitrarily-accurate computations.

      I suspect the problem was speed, not accuracy. More precisely, that digital computers couldn't compute sufficiently-accurate results fast enough.

      Slide rules are very cool as well. I want to learn how to use one.

      That they are. I recently taught myself to use one; it's fun. I can't say that I'm proficient, and I'm sure I never will be fast, but it is fun.

  • Is it just me or does Albert Michelson look the spitting image of Bill Murray?

    Go to 1:20 of the first video to see his picture...
  • by TropicalCoder ( 898500 ) on Saturday November 15, 2014 @09:03PM (#48394591) Homepage Journal
    Step 1.) Put a motor on the crank. Step 2.) Read the output into your computer with an optical mouse in place of the pen. Step 3.) Figure out a way to automate programming of the input. Step 4.) Sell it as a coprocessor! Step 5.) Profit!
  • Michelson designed the machine to run on luminiferous aether.

    • ... and this and other early examples used up all the luminiferous aether, which is why there isn't any now. More proof, if any was needed, that "Science" only creates self-fulfilling prophecies to keep its priesthood in jobs. ;-)

  • When Michelson attempted to create a square wave from a Fourier series (on the gear machine prototype), he discovered what became known as the Gibbs phenomenon http://en.wikipedia.org/wiki/Gibbs_phenomenon. He mentioned the bug to Gibbs, who "discovered" it.
    He was a master engineer and builder, improving many optical measurements. For example, measuring the meter by comparing the length of the metal bars to wavelength of light by counting lots of fringes. The Fourier analysis computer was made to calculate

  • Although I don't think he had success, due to the limits of engineering tech in the mid nineteenth century, I always thought Charles Babbage was considered the father of the computer, via his analytical engine. Are there not blueprints of his failed machine available that could be worked on? BTW, Bruce Sterling and William Gibson co-wrote a pretty interesting novel about how the world could be circa the late nineteenth century if the analytical engine had been successfully built. Additionally, wasn't countess Ad

"If it's not loud, it doesn't work!" -- Blank Reg, from "Max Headroom"

Working...