Which is more accurate, an analog or a digital clock?


Bell
28th February 2011, 11:13 AM
Hypothetically speaking, I think an analog clock is more accurate, since a digital clock always needs to jump from one timestamp to another, whereas an analog clock travels the whole distance.

TubbaBlubba
28th February 2011, 11:15 AM
Evidence?

Ziggurat
28th February 2011, 11:19 AM
Hypothetically speaking, I think an analog clock is more accurate, since a digital clock always needs to jump from one timestamp to another, whereas an analog clock travels the whole distance.

Hypothetically speaking, what you refer to is actually precision and not accuracy. Though commonly conflated, these are actually distinct concepts. Furthermore, you're presuming infinite precision on an analogue clock, but quantum mechanics prohibits that from actually being true.

Beanbag
28th February 2011, 11:21 AM
"Accurate" is a relative term. Based on your analogy, a slide rule should be more accurate than a computer, because the slide rule represents the value exactly on a graduated scale, while a computer can only resolve as fine as the smallest binary value it can express in whatever number handling scheme it uses (i.e. floating point).

For five bucks, I can buy a digital watch that puts the finest analog mechanical marine chronometers ever made to shame.

Alarm clocks typically only need resolution to the minute, or at least that's fine in my life. I don't need to wake at 05:00:00 every morning. Somewhere between 04:58:00 and 05:02:00 works just fine for me.

Beanbag

BenBurch
28th February 2011, 11:28 AM
Analog clocks are "binary" machines too;

http://www.angelfire.com/ut/horology/escapement.html

Binary in the sense that they have two states: free running, and stopped by the escapement.

blutoski
28th February 2011, 12:03 PM
Analog clocks are "binary" machines too;

http://www.angelfire.com/ut/horology/escapement.html

Binary in the sense that they have two states: free running, and stopped by the escapement.

As Ziggurat explained, even if analog clocks have better resolution and are more precise, they're clearly less accurate.

In terms of precision, there's friction and inertia in mechanical works that I'm pretty confident produce greater error bars than the +/-0.05 microsecond of discrepancy in even the cheapest digital clocks (typically 10MHz).

Beanbag
28th February 2011, 12:13 PM
For some reason, this thread reminds me of the Boeing engineer who kept sending his digital watch back for repair because it was 0.02 seconds a day fast.

Beanbag

I Ratant
28th February 2011, 12:15 PM
Both depend on being set properly to start.
My "atomic clocks" use signals from WWV to remain in sync with that source.
The usual plug-in clock uses the 60 Hz line frequency for a starting point.
I have one ac powered digital that advances about 15 seconds a day.
Every 3 or 4 days I have to retard it to the current time.
It's the alarm clock that turns on the light to wake me in the morning, and controls a few appliances for their daily charging time.
My digital wrist watch keeps excellent time.. but it's dependent on how accurately I set it to a time standard.
The clocks on the cellphone and on the cable box are quite accurate relative to the atomic clocks, which run on batteries.

Dr. Keith
28th February 2011, 12:22 PM
Hypothetically speaking, what you refer to is actually precision and not accuracy. Though commonly conflated, these are actually distinct concepts.

Yep.

A clock that is only precise to the nearest minute may be more accurate than one that is precise to the thousandth of a second.

I don't know how many lab courses it takes for this to be hammered home conclusively, but I know at some point in undergrad it really started to bother me that people confused these concepts.

Side note: The weird thing about everyone using a cellphone for a clock is that most people now agree about what time it is. The old adage that "in a room with one clock everyone knows the time, while in a room with two clocks no one does" is now quaintly anachronistic. The old movie scenes of the co-conspirators syncing their watches are kinda funny now.

Quinn
28th February 2011, 12:28 PM
A clock that is only precise to the nearest minute may be more accurate than one that is precise to the thousandth of a second.

I don't know how many lab courses it takes for this to be hammered home conclusively, but I know at some point in undergrad it really started to bother me that people confused these concepts.


Speaking as someone who initially found that distinction hard to grasp, I think your clock explanation may be the most succinct and understandable I've seen. If my high school chem teacher had put it that way, I think I would have gotten it right off the bat.

Ivor the Engineer
28th February 2011, 12:33 PM
As Ziggurat explained, even if analog clocks have better resolution and are more precise, they're clearly less accurate.

In terms of precision, there's friction and inertia in mechanical works that I'm pretty confident produce greater error bars than the +/-0.05 microsecond of discrepancy in even the cheapest digital clocks (typically 10MHz).

Most cheap crystals have an initial accuracy of +/-100ppm, which at 10MHz would be +/-1kHz, resulting in a 1 second tick having an error of up to +/-100us.

Most digital watches use 32.768kHz crystals, which often have an initial accuracy of +/-20ppm.

The initial accuracy is only part of the error. The frequency a crystal resonates at is affected by temperature. A good solution is to clamp it to a hunk of meat held constant at 37 deg. C. The resonant frequency of a crystal also changes slowly over time.
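
For anyone who wants to play with those numbers, here is a minimal back-of-the-envelope sketch in Python of what a ppm tolerance works out to in practice. It only uses the figures quoted above; the helper name is mine.

```python
# Rough sketch: what a crystal tolerance in ppm means for timekeeping.
# The frequencies and tolerances are the figures quoted in the post above.

def crystal_error(nominal_hz: float, tolerance_ppm: float) -> None:
    """Print the frequency error and the resulting timekeeping drift."""
    frac = tolerance_ppm * 1e-6            # fractional frequency error
    print(f"{nominal_hz:,.0f} Hz crystal, +/-{tolerance_ppm} ppm:")
    print(f"  frequency error : +/-{nominal_hz * frac:,.1f} Hz")
    print(f"  error per second: +/-{frac * 1e6:.0f} us")
    print(f"  drift per day   : +/-{frac * 86400:.2f} s")

crystal_error(10_000_000, 100)   # cheap 10 MHz crystal
crystal_error(32_768, 20)        # typical 32.768 kHz watch crystal
```

The 100 ppm part comes out to +/-8.64 s per day before temperature and aging effects are even considered, which is why the initial tolerance is only part of the story.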

Dr. Keith
28th February 2011, 12:50 PM
Speaking as someone who initially found that distinction hard to grasp, I think your clock explanation may be the most succinct and understandable I've seen. If my high school chem teacher had put it that way, I think I would have gotten it right off the bat.

Thanks. I don't remember when I "got it" but I do remember being annoyed in college when we had to go over it ad nauseam. I remember thinking "Should you really be taking an advanced materials lab if you are having trouble with this?" I was not always polite in my thoughts.

TubbaBlubba
28th February 2011, 12:51 PM
If I understand this correctly...

On one hand, you have a glass marked with squiggly lines purportedly each marking a deciliter.

On the other hand, you have a pint glass that's measured to ten significant figures and cut with laser beams to molecular perfection.

The first glass is more precise, the pint glass is more accurate?

Beanbag
28th February 2011, 12:53 PM
Little known fact (outside of the watchmaker community): the crystals in modern quartz ANALOG watches are deliberately cut to run FAST. Adjusting the rate is done by inhibition, where the timing machine determines the actual frequency, then instructs the internal divider chain in the watch's circuitry to "ignore" so many counts over a certain interval of time to slow the watch back down to normal.

The programming is all digital, done through the battery terminals using a special machine.

It used to be that the old quartz watches had a trimmer capacitor for adjusting the rate, but you don't find them any more.

Beanbag
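
To make the inhibition idea concrete, here's a toy sketch in Python. The 60-second correction window and the example measured frequency are made-up numbers for illustration only, not the spec of any real watch circuit.

```python
# Toy model of rate adjustment by inhibition, per the description above.
# The window length and the measured frequency are assumptions.

NOMINAL_HZ = 32768        # rate the divider chain expects
WINDOW_S = 60             # hypothetical inhibition interval

def pulses_to_inhibit(measured_hz: float, window_s: int = WINDOW_S) -> int:
    """Whole crystal pulses to 'ignore' per window so the divided
    output averages out to the nominal rate."""
    excess_pulses = (measured_hz - NOMINAL_HZ) * window_s
    return round(excess_pulses)

measured = 32768.73       # crystal deliberately cut to run slightly fast
skip = pulses_to_inhibit(measured)
residual_hz = (measured - NOMINAL_HZ) - skip / WINDOW_S
print(f"skip {skip} pulses every {WINDOW_S} s "
      f"(residual error ~{residual_hz / NOMINAL_HZ * 1e6:.2f} ppm)")
```

The leftover error after inhibition is set by the window length: the longer the window, the finer the fraction of a pulse per second you can trim away.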

TheGoldcountry
28th February 2011, 12:56 PM
All of these posts remind me of the futility of the desire for "exact time." In olden days, monks used candles to measure time at night, when there was no sun. The accuracy of their candles was pretty good. As soon as you look at a watch, clock, computer, cell phone or anything else that records the passage of time, the time is, by definition, different. I can predict that two days from now, the sun will be in a certain point in the sky. But I can never truthfully say that I saw it in that exact point, because it's constantly moving (or my vantage point is).

Beanbag
28th February 2011, 01:03 PM
And on a similar vein, marine chronometers weren't designed to be particularly accurate. They were designed to be CONSISTENT. Over wildly varying ranges of temperature and position, they maintained a consistent RATE of timekeeping, varying very little. It didn't matter to a navigator so much that the time displayed wasn't dead on. Instead, he knew it had been so many days since they left port (and the chronometer had been set to standard time). So by multiplying the days from port by the daily rate of the chronometer, he knew how many seconds to add or subtract from the chronometer reading to get the precise time.

It's actually much easier to design and build a mechanism that runs consistently than it is to make a mechanism that runs consistently AND at the correct rate.

Beanbag

Ivor the Engineer
28th February 2011, 01:05 PM
Little known fact (outside of the watchmaker community): the crystals in modern quartz ANALOG watches are deliberately cut to run FAST. Adjusting the rate is done by inhibition, where the timing machine determines the actual frequency, then instructs the internal divider chain in the watch's circuitry to "ignore" so many counts over a certain interval of time to slow the watch back down to normal.

The programming is all digital, done through the battery terminals using a special machine.

It used to be that the old quartz watches had a trimmer capacitor for adjusting the rate, but you don't find them any more.

Beanbag

Interesting. I didn't know that.

Beanbag
28th February 2011, 01:13 PM
Interesting. I didn't know that.

Fewer things for non-watchmakers to fiddle with and mess up when they open the watch, plus you don't lose the calibration when you disassemble, clean, and reassemble the movement.

Unfortunately, that means having to spring about $3,000 USD for a Witschi Q-6000 watch tester if you want to adjust the rate.

Beanbag

jsfisher
28th February 2011, 01:15 PM
If I understand this correctly...

On one hand, you have a glass marked with squiggly lines purportedly each marking a deciliter.

On the other hand, you have a pint glass that's measured to ten significant figures and cut with laser beams to molecular perfection.

The first glass is more precise, the pint glass is more accurate?


The way I learned it was that precision relates to repeatability and accuracy is closeness to the target. In archery terms, arrows generally centered on the bullseye is accuracy; arrows tightly clustered (anywhere on the target) is precision.

They probably were lying when they told me that. ;)

Ivor the Engineer
28th February 2011, 01:17 PM
The way I learned it was that precision relates to repeatability and accuracy is closeness to the target. In archery terms, arrows generally centered on the bullseye is accuracy; arrows tightly clustered is precision.

They probably were lying when they told me that. ;)

Not at all, that's an accurate and precise description. ;)

Dr. Keith
28th February 2011, 01:18 PM
If I understand this correctly...

On one hand, you have a glass marked with squiggly lines purportedly each marking a deciliter.

On the other hand, you have a pint glass that's measured to ten significant numbers and cut with laser beams to molecular perfections.

The first glass is more precise, the pint glass is more accurate?

Backwards, I think, although it is harder to imagine volume measures that are precise and inaccurate. Here is my best attempt:

Take a glass with marks every 50ml that has been tested to be within 0.5ml of being right at each mark. It is quite accurate and relatively precise for most uses, so long as you need a measure of some multiple of 50ml. If it is between marks the precision starts to influence the usefulness, as it is hard to measure 130ml in such a glass.

Take another glass with laser etched increments down the side every 5ml. Then imagine the laser etcher was off that day and the etchings are "off" by about 50ml. In other words, when the glass says you have 55ml you actually have 105ml. This glass is more precise than the previous glass, because it can measure down to the closest 5ml, but it is less accurate because if you use its measure you will be more wrong.

The second glass above would be trashed by the manufacturer in most cases, so it is a bit harder to imagine encountering this in a real lab. That is why this is easier to explain with instruments that require calibration, such as thermometers, spring scales, and clocks, but I tried to stick to your hypothetical and move the calibration back to the original marking.

The key is that accuracy and precision are not dependent on each other, except in that a device may only be accurate to its level of precision. An accurate calendar can't tell you the time of day, but an inaccurate clock (with date function) may have you on the wrong day to begin with.

Ziggurat
28th February 2011, 01:26 PM
The way I learned it was that precision relates to repeatability and accuracy is closeness to the target. In archery terms, arrows generally centered on the bullseye is accuracy; arrows tightly clustered (anywhere on the target) is precision.

They probably were lying when they told me that. ;)

The arrow analogy is a decent one. But precision isn't just about repeatability, it's also about the degree of discrimination. So a measurement with 5 decimal places is more precise, though not necessarily more accurate, than a measurement with 2 decimal places.
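
The archery picture translates directly into numbers. A throwaway Python sketch (the true value, offsets and spreads are invented): the mean offset is the accuracy problem, the scatter is the precision problem.

```python
# Quick numeric illustration of accuracy vs. precision.
# The "true" value, the offsets and the spreads are made up.
import random, statistics

TRUE_VALUE = 100.0

def take_readings(offset: float, spread: float, n: int = 1000) -> list[float]:
    """Simulate an instrument with a systematic offset (accuracy error)
    and random scatter (precision error)."""
    return [TRUE_VALUE + offset + random.gauss(0, spread) for _ in range(n)]

accurate_but_imprecise = take_readings(offset=0.0, spread=5.0)   # centred, wide scatter
precise_but_inaccurate = take_readings(offset=8.0, spread=0.2)   # tight, off-target

for name, data in [("accurate but imprecise", accurate_but_imprecise),
                   ("precise but inaccurate", precise_but_inaccurate)]:
    print(f"{name}: mean = {statistics.mean(data):7.2f} (true {TRUE_VALUE}), "
          f"std dev = {statistics.stdev(data):.2f}")
```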

bokonon
28th February 2011, 01:29 PM
A clock that is only precise to the nearest minute may be more accurate than one that is precise to the thousandth of a second.

I don't know how many lab courses it takes for this to be hammered home conclusively, but I know at some point in undergrad it really started to bother me that people confused these concepts.

Speaking as someone who initially found that distinction hard to grasp, I think your clock explanation may be the most succinct and understandable I've seen. If my high school chem teacher had put it that way, I think I would have gotten it right off the bat.

Since you're most likely using your time machines to measure intervals in the lab, unless they're wildly inaccurate I don't see why it's even an issue. A clock that is only precise to the nearest minute is going to be useless when the protocol calls for 25 seconds of something, no matter how accurate it is.

TubbaBlubba
28th February 2011, 01:38 PM
Backwards, I think, although it is harder to imagine volume measures that are precise and inaccurate. Here is my best attempt:

Take a glass with marks every 50ml that has been tested to be within 0.5ml of being right at each mark. It is quite accurate and relatively precise for most uses, so long as you need a measure of some multiple of 50ml. If it is between marks the precision starts to influence the usefulness, as it is hard to measure 130ml in such a glass.

Take another glass with laser etched increments down the side every 5ml. Then imagine the laser etcher was off that day and the etchings are "off" by about 50ml. In other words, when the glass says you have 55ml you actually have 105ml. This glass is more precise than the previous glass, because it can measure down to the closest 5ml, but it is less accurate because if you use its measure you will be more wrong.

That's pretty much the same thing as I said, isn't it? The glass with hand-drawn deciliter lines can measure integer multiples of a deciliter, but they won't be very accurate deciliters. So it is precise to a deciliter, but it is not an accurate deciliter.

The second glass can only measure integer multiples of a pint, so its precision is to a pint (i.e. less precise than to a deciliter), but they're very, very accurate pints.

Dr. Keith
28th February 2011, 02:02 PM
That's pretty much the same thing as I said, isn't it? The glass with hand-drawn deciliter lines can measure integer multiples of a deciliter, but they won't be very accurate deciliters. So it is precise to a deciliter, but it is not an accurate deciliter.

The second glass can only measure integer multiples of a pint, so its precision is to a pint (i.e. less precise than to a deciliter), but they're very, very accurate pints.

Yep, I don't really know why I didn't get it the first time you said it.

Dr. Keith
28th February 2011, 02:07 PM
Since you're most likely using your time machines to measure intervals in the lab, unless they're wildly inaccurate I don't see why it's even an issue. A clock that is only precise to the nearest minute is going to be useless when the protocol calls for 25 seconds of something, no matter how accurate it is.

Right, it is the wrong tool for the job, but not due to a lack of accuracy, only a lack of precision.

It is typically an issue where someone asserts that a timepiece with four digits to the right of the decimal is more accurate than one with three digits to the right of the decimal. Especially when the required precision is to the full second. The added precision is useless and leads to a false assumption of added accuracy.

Ivor the Engineer
28th February 2011, 02:13 PM
Right, it is the wrong tool for the job, but not due to a lack of accuracy, only a lack of precision.

<snip>

I'd say it was because of the insufficient resolution of the clock, rather than its precision.

I Ratant
28th February 2011, 04:09 PM
All of these posts remind me of the futility of the desire for "exact time." In olden days, monks used candles to measure time at night, when there was no sun. The accuracy of their candles was pretty good. As soon as you look at a watch, clock, computer, cell phone or anything else that records the passage of time, the time is, by definition, different. I can predict that two days from now, the sun will be in a certain point in the sky. But I can never truthfully say that I saw it in that exact point, because it's constantly moving (or my vantage point is).
.
During Alexander's rampages through the world, the various shift changes for the guards were determined by soaking a rag in water, and tying it around the guard's wrist.
When the cloth dried out completely, it was time for the next shift to show.
They called the cloth Alexander's Rag Time Band.

CapelDodger
28th February 2011, 06:56 PM
This glass is more precise than the previous glass, because it can measure down to the closest 5ml, but it is less accurate because if you use its measure you will be more wrong.

Precisely right.

CapelDodger
28th February 2011, 06:59 PM
Damn, my avatar's right next to I Ratant's, he'll never settle now.

CapelDodger
28th February 2011, 07:02 PM
Right, it is the wrong tool for the job, but not due to a lack of accuracy, only a lack of precision.

It is typically an issue where someone asserts that a timepiece with four digits to the right of the decimal is more accurate than one with three digits to the right of the decimal. Especially when the required precision is to the full second. The added precision is useless and leads to a false assumption of added accuracy.

Yeah, but this goes up to 11.

TubbaBlubba
28th February 2011, 07:12 PM
Right, it is the wrong tool for the job, but not due to a lack of accuracy, only a lack of precision.

It is typically an issue where someone asserts that a timepiece with four digits to the right of the decimal is more accurate than one with three digits to the right of the decimal. Especially when the required precision is to the full second. The added precision is useless and leads to a false assumption of added accuracy.

This makes sense. It's terribly annoying when people report values to five significant figures when they only measured two accurately...

Dr. Trintignant
28th February 2011, 11:56 PM
The usual plug-in clock uses the 60 Hz line frequency for a starting point.

The utility frequency is usually calibrated to an atomic clock. At any given moment, the time may be several seconds off, but the frequency is adjusted to correct for this in the long term. So, a plug-in clock with either a synchronous motor or frequency-calibrated electronic circuit will never need to be set (excepting DST). I have a clock that I haven't had to change the minutes setting on for years.

- Dr. Trintignant

Dr. Trintignant
28th February 2011, 11:58 PM
This makes sense. It's terribly annoying when people report values to five significant figures when they only measured two accurately...

High precision with poor accuracy can still be useful if you're concerned about differences in the measurement in question.

- Dr. Trintignant

Beerina
1st March 2011, 12:03 AM
Hypothetically speaking, I think an analog clock is more accurate, since a digital clock always needs to jump from one timestamp to another, whereas an analog clock travels the whole distance.

It is, assuming the second hand is moving continuously.

Many analog clocks don't do that, though; the second hand goes "tick tick tick", which makes it indistinguishable from a digital clock.

So assume a continuously-moving analog clock. With a digital clock, at any given moment when you look and take a "reading" to the second, the actual time includes some fraction of a second, so you'll be off by 1/4 s on average; but since the error is +/-, the signed error averages out to 0*.

So...if you had infinitely good eyesight and looked at a continuously-moving analog clock, you could discern the time, say, to the closest 100th of a second.

Looking at a digital clock, limited to the second, while, on average over many readings, you'd be off by 0s, any particular reading would almost never be within 1/100th of a second of the actual time. Specifically, 99 times out of 100, the digital clock would be worse off than the analog clock. The smaller the reading (1/1000th, 1/10,000th, etc.) the less likely the digital clock is to be perfectly accurate on any single reading. So I suppose it would depend on how you define "accurate" -- any particular reading, or the average over many.


ETA: Even if we used "average" error over many readings, at best the clocks would be equally accurate. For any single reading, the analog clock wins almost every time. Again, that assumes fractions of a second count as errors. If we limit everything to "the nearest second", even for readings of a continuously-moving analog clock, then said clock becomes indistinguishable from a tick-tick-tick analog clock, and thus from a digital clock.


Oh my nerdiness is in fine form tonight (http://forums.randi.org/showthread.php?t=202285).

* This assumes the period of quiescence while pointing at one particular number is centered, time-wise, about that exact indicated moment. This would normally be the case if the "tick" occurs exactly at the end of a second, neglecting the time spent moving the second hand. If the amount of time spent ticking grew longer and longer, you'd become more accurate, as you were approaching analog clock continuous movement! Of course, that also assumes our infinitely-good eyesighted test subject can accurately discern the time of a "ticky" analog clock, mid-tick! If limited to just the period of quiescence, they gain no bonus no matter how protracted the "tick" movement is, as long as it remains sub-continuous.
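
For what it's worth, a quick Monte Carlo check of the above in Python, under the same assumptions (display centred on the indicated second, perfect eyesight reading the analog face to 1/100 s): the digital clock's signed error averages out near zero, but its typical error on any single reading is about a quarter of a second, while the "analog" reading is off by hundredths.

```python
# Monte Carlo check of the average reading error for a digital display
# read to the nearest second vs. an analog face read to 1/100 s.
import random

N = 100_000
true_times = [random.uniform(0, 60) for _ in range(N)]

digital_err = [round(t) - t for t in true_times]      # reading to the nearest second
analog_err  = [round(t, 2) - t for t in true_times]   # reading to the nearest 1/100 s

for name, errs in [("digital (1 s)", digital_err), ("analog (0.01 s)", analog_err)]:
    mean_signed = sum(errs) / N
    mean_abs = sum(abs(e) for e in errs) / N
    print(f"{name}: mean signed error {mean_signed:+.4f} s, mean |error| {mean_abs:.4f} s")
```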

PixyMisa
1st March 2011, 12:30 AM
Hypothetically speaking, I think an analog clock is more accurate, since a digital clock always needs to jump from one timestamp to another, whereas an analog clock travels the whole distance.
Tick tock.

Andrew Wiggin
1st March 2011, 02:19 AM
A good solution is to clamp it to a hunk of meat held constant at 37 deg. C.

This makes it sound like the wrist is a poor location for a watch; the crotch watch or armpit watch would drift less, although be harder to read in public.

BTMO
1st March 2011, 02:51 AM
Write a time on a wall. At *precisely* the correct time, read it.

Congratulations - this is the most accurate clock in the world - for the briefest of instants.

The answer to your actual question is, neither, or perhaps, both. Neither digital nor analogue clocks are inherently more accurate than the other. It depends on many, many things.

Incidentally, I used to work in a calibration lab, in part of the national chain of standards in Australia. Our atomic clock had an analogue readout...

Toke
1st March 2011, 04:27 AM
On some vessels the ballast tank sensors are connected to a simple analog mA meter with a backplate showing e.g. 0-600 ton for the 4-20mA meter.

On others the sensor is connected to the ship's computer system and gives a readout to the kilogram.

It is quite difficult to convince the bridge officers that there is no difference in sensor accuracy, and that the extra precision is pretty pointless if the accuracy is only within 10 tons.

ingoa
1st March 2011, 05:02 AM
In physics, in most cases (as I see it):

The error in precision = statistical error
The error in accuracy = systematic error

Cuddles
1st March 2011, 05:14 AM
My watch has both analogue and digital readouts. The digital is far more precise since it shows seconds, but there's no analogue minute hand. The digital part is also more accurate, since I somehow managed to offset the minute hand by a couple of minutes and I can't work out how to get it back again. So, digital is more accurate and more precise, unless someone can find the manual for my watch.

Ivor the Engineer
1st March 2011, 05:16 AM
In physics, in most cases (as I see it):

The error in precision = statistical error
The error in accuracy = systematic error

Electronics/Signal processing:

The error in precision = noise
The error in accuracy = offset and/or gain error

jsfisher
1st March 2011, 06:24 AM
Electronics/Signal processing:

The error in precision = noise
The error in accuracy = offset and/or gain error

That sounds backwards.

69dodge
1st March 2011, 07:04 AM
With a digital clock, at any given moment, when you look and get a "reading" to the second, but the actual time is some fraction of a second, you'll, on average, be 1/4s off, but since this is +/-, it averages out to 0*.

[...]

* This assumes the period of quiescence while pointing at one particular number is centered, time-wise, about that exact indicated moment.

Right.

This would normally be the case if the "tick" occurs exactly at the end of a second, [...]

Shouldn't that be "exactly halfway between the beginning and end of each second"? For example, the clock reading should change from 10:40:12 to 10:40:13 when the time is exactly 10:40:12.5.

BenBurch
1st March 2011, 09:14 AM
As Ziggurat explained, even if analog clocks have better resolution and are more precise, they're clearly less accurate.

In terms of precision, there's friction and inertia in mechanical works that I'm pretty confident produce greater error bars than the +/-0.05 microsecond of discrepancy in even the cheapest digital clocks (typically 10MHz).

Not necessarily. The Shortt-Synchronome was a semi-mechanical clock that was amazingly accurate.

http://en.wikipedia.org/wiki/Shortt-Synchronome_clock

Dr. Keith
1st March 2011, 09:21 AM
This makes sense. It's terribly annoying when people report values to five significant figures when they only measured two accurately...

Yeah, I get a bit frustrated with my grade school daughter's math homework sometimes because I want to correct the answer to the appropriate significant figures, but then I know she will be counted off as if she didn't do the calculations correctly. So, I tell her that one day she will notice . . . (lays groundwork for understanding significant figures).

blutoski
1st March 2011, 09:24 AM
Not necessarily. The Shortt-Synchronome was a semi-mechanical clock that was amazingly accurate.

http://en.wikipedia.org/wiki/Shortt-Synchronome_clock

For sure, but I'm speaking of the general situation. Not a lot of people walking around with watches that have pendulums swinging in a vacuum.

If it came down to 'is the most accurate analog clock ever built better than a mass-market digital' I think it'd be a tough call.

I Ratant
1st March 2011, 09:33 AM
The utility frequency is usually calibrated to an atomic clock. At any given moment, the time may be several seconds off, but the frequency is adjusted to correct for this in the long term. So, a plug-in clock with either a synchronous motor or frequency-calibrated electronic circuit will never need to be set (excepting DST). I have a clock that I haven't had to change the minutes setting on for years.

- Dr. Trintignant
.
Mine tend to drift.
After every power outage, I reset all 7 of them using one of the atomic clocks, and get them within a minute, but who knows which end of the minute they're at when I set them.. :)
The clocks on the computers and the cable box are updated by their systems, but those numbers are too small to see from the bed. :)

Dr. Keith
1st March 2011, 09:34 AM
High precision with poor accuracy can still be useful if you're concerned about differences in the measurement in question.

- Dr. Trintignant

That assumes that the inaccuracy is linear and consistent. A bad battery connection may cause some very interesting inaccuracy in either a digital or analog timepiece.

This reminds me of a scam several years ago where owners were rigging gasoline dispensers to be accurate when the amount purchased was exactly 5 gallons or exactly 10 gallons, but otherwise to overstate the amount of fuel dispensed. This was because the regulator who checked their accuracy always filled up ten gallon and five gallon containers. The regulators assumed any inaccuracy would be linear and consistent. Oops!

It had to be some time ago, because the reporters were shocked, shocked I tell you, that the lowly gas station owners could afford the pentium processors used in the scam. As if gas station owners couldn't also be technically competent and a few hundred dollars was a big deal when scamming thousands of dollars a day.

I Ratant
1st March 2011, 09:35 AM
High precision with poor accuracy can still be useful if you're concerned about differences in the measurement in question.

- Dr. Trintignant
.
Yes.
As long as the measurement procedure is consistent, errors in the measurements can be ignored.
It just makes them hard to relate to other measurements.
It is nice to know the starting point accurately, though.

I Ratant
1st March 2011, 09:36 AM
Tick tock.
.
Tick tock goes the harlequin man.

I Ratant
1st March 2011, 09:39 AM
My watch has both analogue and digital readouts. The digital is far more precise since it shows seconds, but there's no analogue minute hand. The digital part is also more accurate, since I somehow managed to offset the minute hand by a couple of minutes and I can't work out how to get it back again. So, digital is more accurate and more precise, unless someone can find the manual for my watch.
.
One of my bosses in Instrumentation had a watch that was off by many minutes and seconds, and he'd make the adjustment mentally when you asked him the time!
One of my self-winding watches -stopped- one day when I was driving from VA to CA, going across Kansas on 40.
There was no reason to move my left arm for such a long period of time, so the thing just stopped!

I Ratant
1st March 2011, 09:42 AM
Yeah, I get a bit frustrated with my grade school daughter's math homework sometimes because I want to correct the answer to the appropriate significant figures, but then I know she will be counted off as if she didn't do the calculations correctly. So, I tell her that one day she will notice . . . (lays groundwork for understanding significant figures).
.
Significant figure....

TheGoldcountry
1st March 2011, 12:07 PM
So...if you had infinitely good eyesight and looked at a continuously-moving analog clock, you could discern the time, say, to the closest 100th of a second.

Not relevant. We can qualitatively judge someone else's eyesight (or other observations) to be better or worse than our own. No one possesses "infinitely good eyesight." Unless they're Superman, and I haven't met him yet.

Dr. Keith
1st March 2011, 12:12 PM
.
Significant figure....

Surprisingly, that is not what I was discussing with my daughter.

Dr. Trintignant
1st March 2011, 01:40 PM
That assumes that the inaccuracy is linear and consistent.

Indeed it does (I'll note that I did say "can"). Though it should be said that non-linearities can also be accounted for if necessary. Inconsistency is harder to overcome.

This was because the regulator who checked their accuracy always filled up ten gallon and five gallon containers. The regulators assumed any inaccuracy would be linear and consistent.

Heh--clever. Well, it was consistent at least! I wonder if there were true discontinuities in the reading, or if the operator just set the mapping function to "catch up" at the 5 and 10 gallon marks, and be depressed otherwise.

- Dr. Trintignant

Mikemcc
1st March 2011, 01:48 PM
When I was doing surveying I was taught that we could estimate to a 1/4 of a division on an analogue scale. I was using a 'director' - a theodolite used in military applications. It was marked down to the mil, but from that could be read to an accuracy of a 1/4 of a mil (slightly better than a minute of arc). The builder's surveyors, when our barracks were being re-designed, were very jealous!

An analogue clock can be very precise, but the precision will cost in sheer size of the device and in the movement. To build a readable scale at a high degree of precision would require a rather large device (unless employing expensive optics to read a small scale). A digital system is far easier, and cheaper, to build at a high precision.

This is not the same as accuracy. You can have a very precise but inaccurate instrument; this may be acceptable (as mentioned above) if you are only looking at differences rather than absolute values.

I Ratant
1st March 2011, 03:16 PM
Surprisingly, that is not what I was discussing with my daughter.
.
That's only one significant figure. :)
Taking something large that might interest anyone, like a mile, computing that to 5 significant figures versus 4 results in a difference of about 3 inches, hardly enough to be concerned with.
3 figures is the practical limit for anything in a day-to-day encounter.

Andrew Wiggin
1st March 2011, 10:34 PM
.
Tick tock goes the harlequin man.

Just never ask Harlan where he got the jellybeans.

nathan
2nd March 2011, 01:36 AM
Most digital watches use 32.768kHz crystals

This has always puzzled me. Dividing by 2^15 obviously gives you 1 Hz, but the quartz watches I've had have had a stopwatch function with finer granularity. The digital one I had had 10 ms precision, the analogue one I have now has 1/5 s increments. Neither of those is easily derivable from a 32 kHz clock.

nathan
2nd March 2011, 01:42 AM
The digital part is also more accurate, since I somehow managed to offset the minute hand by a couple of minutes and I can't work out how to get it back again.

Oh come on, how hard can it be? It's not like it's nuclear physics or anything!

oh ...

TubbaBlubba
2nd March 2011, 02:50 AM
High precision with poor accuracy can still be useful if you're concerned about differences in the measurement in question.

- Dr. Trintignant

I think I understand what you mean, but can you give me an example of this?

nathan
2nd March 2011, 04:34 AM
I think I understand what you mean, but can you give me an example of this?

Differential GPS?

Ivor the Engineer
2nd March 2011, 04:39 AM
This has always puzzled me. Dividing by 2^15 obviously gives you 1 Hz, but the quartz watches I've had have had a stopwatch function with finer granularity. The digital one I had had 10 ms precision, the analogue one I have now has 1/5 s increments. Neither of those is easily derivable from a 32 kHz clock.

32.768kHz x 25/8192 = 100Hz.

So build a 13-bit accumulator (0 to 8191) and every clock cycle of the 32.768kHz master clock, add 25 to the total. Allow the accumulator to overflow and use the rising edge of the most significant bit to clock the stopwatch.

There will be a bit of jitter on the period (either 0.2% too short or 0.1% too long), but on average the rate will be 1 tick per 10ms.

For those interested, this is the idea behind a numerically controlled oscillator (http://en.wikipedia.org/wiki/Numerically-controlled_oscillator).
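
A quick software model of that accumulator (a Python sketch, counting overflows directly rather than watching the MSB, which gives the same average rate) reproduces both the 100Hz figure and the jitter:

```python
# Software model of the 13-bit phase accumulator described above.
MASTER_HZ = 32768      # 32.768 kHz master clock
INCREMENT = 25
MODULUS = 8192         # 13-bit accumulator wraps at 8192

acc = 0
last_tick = 0
tick_periods = []      # master-clock cycles between stopwatch ticks
for cycle in range(1, MASTER_HZ * 10 + 1):   # simulate 10 seconds
    acc += INCREMENT
    if acc >= MODULUS:                       # overflow -> one stopwatch tick
        acc -= MODULUS
        tick_periods.append(cycle - last_tick)
        last_tick = cycle

print(f"average tick rate: {len(tick_periods) / 10:.2f} Hz")
print(f"tick periods seen: {sorted(set(tick_periods))} cycles "
      f"(nominal {MODULUS / INCREMENT} cycles)")
```

Running it shows ticks spaced either 327 or 328 master-clock cycles apart around the nominal 327.68, i.e. the -0.2%/+0.1% jitter mentioned, with the long-term average at 100Hz.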

BenBurch
2nd March 2011, 10:29 AM
For sure, but I'm speaking of the general situation. Not a lot of people walking around with watches that have pendulums swinging in a vacuum.

If it came down to 'is the most accurate analog clock ever built better than a mass-market digital' I think it'd be a tough call.

Source: http://www.chronocentric.com/watches/accuracy.shtml

Dr. Keith
2nd March 2011, 10:42 AM
Indeed it does (I'll note that I did say "can"). Though it should be said that non-linearities can also be accounted for if necessary. Inconsistency is harder to overcome.

Agreed, and I wasn't sure "linear" really is the best word to use, but it was what I could come up with and you fell for it, so forget I am now questioning it . . .

Heh--clever. Well, it was consistent at least! I wonder if there were true discontinuities in the reading, or if the operator just set the mapping function to "catch up" at the 5 and 10 gallon marks, and be depressed otherwise.

- Dr. Trintignant

I really don't remember the details other than people with X gallon tanks noticing that they were being charged for buying more than X gallons of gas, but only at certain stations. I guess most people don't drain it to a consistent enough mark, but I guess enough people did to make the regulators double check their measurement protocol.

Frankly, I was a bit impressed by the whole endeavor.

Dr. Keith
2nd March 2011, 10:47 AM
I think I understand what you mean, but can you give me an example of this?

If you go back to the graduated cylinder that has 5ml increments that are off by 50ml, you can measure which orange produced more juice and how much more juice with such a cylinder, but you can't measure how much juice was produced by each orange. Differences, but not absolutes.

This works better with a clock that is not set to the right time, but does not slow down or speed up over time. Using differential measurements on that clock you can tell that runner A ran the mile in 5 minutes 4 seconds and runner B ran the mile in 5 minutes 12 seconds, but you can not tell what time of day they actually started running or the time when they finished running. Differences, not absolutes.
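
As a trivial illustration of that "differences, not absolutes" point, here's a small Python sketch; the 137-second offset is just an arbitrary stand-in for "set to the wrong time":

```python
# A clock with a constant (unknown) offset still times intervals correctly.
OFFSET_S = 137.0                      # arbitrary "set wrong" error, unknown to the user

def wrong_clock(true_time_s: float) -> float:
    """A clock that is set wrong but runs at exactly the right rate."""
    return true_time_s + OFFSET_S

start = wrong_clock(0.0)
runner_a = wrong_clock(5 * 60 + 4)    # finishes the mile in 5:04
runner_b = wrong_clock(5 * 60 + 12)   # finishes the mile in 5:12

print("Runner A:", runner_a - start, "s")    # 304.0 -- the offset cancels
print("Runner B:", runner_b - start, "s")    # 312.0 -- the offset cancels
print("Time of day at the start? No idea; the reading is off by", OFFSET_S, "s")
```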

I Ratant
2nd March 2011, 10:59 AM
Just never ask Harlan where he got the jellybeans.
.
And what he did with them later!
Ewwwwwww!

Dr. Trintignant
2nd March 2011, 03:44 PM
I think I understand what you mean, but can you give me an example of this?

Sure. In fact, the tare button on a scale uses it as a feature. Put a 1 kg mass on the scale, press tare, and now your scale is reading 1 kg off from what it "should be." But if you add another 5 grams, the scale (if it has enough precision) will read the correct 5 grams.

Another example might be an alcohol/mercury thermometer with a sliding scale. One can slide the scale to a random position (within limits), and although you won't know the actual temperature, you can confidently tell if the temperature increases by 5 degrees.

Calculating altitude from barometric pressure is another example relevant to pilots. To get an accurate ground reference, pilots must calibrate their altimeters to the local barometric pressure reading. If you don't do this, you won't know where the ground is, but you can still tell if you've increased or decreased altitude by a certain amount.

Dr. Keith's examples are also good.

- Dr. Trintignant

Andrew Wiggin
2nd March 2011, 10:06 PM
.
And what he did with them later!
Ewwwwwww!

Ewwww? IIRC he dumped them into the gears of an escalator to gum up the works and make people late.

For some reason, Harlan is enraged by this question though. Of course Harlan is famous for being easily enraged, amongst other things. Giving someone new to the sci-fi con scene a bag of jellybeans and telling them they're Harlan's favorite treat is a much abused prank, at least in anecdote.

TheGoldcountry
3rd March 2011, 12:39 AM
I think that most of us are forgetting the most important point- whatever instrument we use to measure time (chemical, mechanical, electronic, or even natural or biological) it's always clouded by our own viewpoint. As I said before, we can personally observe that action X takes Y seconds to affect object Z, but we can't honestly say that we witnessed that event, because time is fluid.

MRC_Hans
3rd March 2011, 01:13 AM
I think it has already been covered, but allow me to sum up, since this is within my area of expertise ;).

A clock is a measuring instrument, measuring time.

The following, essentially unrelated, factors determine the 'goodness' of a measuring instrument:

Accuracy: how closely related the readout is to the measured entity.

Resolution: how finely graded the readout is (e.g. the number of digits on a digital readout).

Repeatability: how close repeated readings of the same entity are (this includes how well the user can discern differences).

Calibration: how well the instrument is aligned to the desired unit of measure or, in the case of a clock, the local reference.

Now let's look at a clock:

Accuracy: Here it means how closely it keeps time. In principle, digital and analog clocks have the same potential, but in practice, electronic time-keepers can be made far more accurate (this, of course, also applies to electronic clocks with an analog read-out).

Resolution: In principle, a digital clock can be fitted with many decimals, but your usual timepiece has a one-second resolution. So, provided the analog readout has a second hand, it has the same or better resolution.

Repeatability: For a clock, this is mainly the user's ability to discern the readout. Here, the digital readout is less ambiguous.

Calibration: For a clock, there are two kinds of calibration. The first is the running time, which affects accuracy (in the form of the device's ability to keep its accuracy over time). For technical reasons, the electronic timepiece has the advantage here. The other calibration is the setting of the local time. As already mentioned, in principle ANY clock can be set to the correct time, even one you draw on the wall ;).

Hans

Soapy Sam
3rd March 2011, 01:29 AM
Tick tock.

This is classic, narrow minded digitalism.

In the Real Universe (TM), analogue clocks emit a high frequency whistle, but you have to be going at relativistic velocities to hear it. We humans, in our slow, middle-world time frame, hear the quantised version.
In fact it's a routine experimental test for Venusian starship engines- they fly past Big Ben and adjust their speedometers using a microphone. Corrected for Doppler shift, obviously.

I Ratant
3rd March 2011, 08:40 AM
One of them coinkydinkys...
Last night in between dreams, I cogitated that prior to the invention of the railroad, Doppler shift was unknown.
And here we have it being used by Venusians!
Was Doppler a Venusian?

Andrew Wiggin
3rd March 2011, 10:37 PM
One of them coinkydinkys...
Last night in between dreams, I cogitated that prior to the invention of the railroad, Doppler shift was unknown.

You'd be right. The first steam railway was 1804, and Doppler proposed the idea that became known as the 'Doppler Shift' in 1842. I would think it was probably observed earlier, but without understanding the nature of sound as a pressure wave or light as an electromagnetic wave, the reason for the shift would be obscure. I can see someone on a medieval battlefield somewhere thinking 'I wonder why the whistle of the arrows the enemy shoots towards me is so much sharper pitched than the whistle of the arrows I shoot at the enemy?' It would probably have been a Japanese battlefield; they sometimes put whistles on their arrows, IIRC, for signaling, good luck, to terrify the enemy, and such.

Bill Thompson
3rd March 2011, 10:41 PM
Hypothetically speaking, I think an analog clock is more accurate, since a digital clock always needs to jump from one timestamp to another, whereas an analog clock travels the whole distance.

This is a cool topic.

Yeah, an analog clock is more accurate since time does not move in sudden jerks according to man-made intervals.

Ivor the Engineer
4th March 2011, 01:41 AM
This is a cool topic.

Yeah, an analog clock is more accurate since time does not move in sudden jerks

Maybe. Maybe not.

according to man-made intervals.

I agree.

OnlyTellsTruths
4th March 2011, 04:57 AM
wrong thread :)

BenBurch
4th March 2011, 09:45 AM
This is a cool topic.

Yeah, an analog clock is more accurate since time does not move in sudden jerks according to man-made intervals.

Analog clocks move in sudden jerks...

Only the pendulum has a smooth oscillation.

Same is true of the crystal circuit in a digital clock, though.

DavidS
4th March 2011, 07:19 PM
Analog clocks move in sudden jerks...

Only the pendulum has a smooth oscillation.

Same is true of the crystal circuit in a digital clock, though.
That's a pretty broad brush you've got there. Don't use it to paint your sundial. ;)

Illustronic
4th March 2011, 08:13 PM
My internal clock is the most accurate; I have never needed an alarm clock to wake up. Even when I expect to get 2 hours of sleep or less before an important deadline, I set one anyway, and usually wake up before it goes off, several times just a minute before it did.

In my head is the most accurate (important) clock in (my) world.

Illustronic
4th March 2011, 08:15 PM
You are born, you die, and the wheels on the bus go round and round...

BenBurch
4th March 2011, 08:42 PM
Wrong thread, sorry.