PDA

View Full Version : 1200 year old math problem solved by a computer scientist

becomingagodo
5th December 2007, 11:25 AM
http://www.bbc.co.uk/berkshire/content/articles/2006/12/06/divide_zero_feature.shtml
Dr James Anderson, from the University of Reading's computer science department, says his new theorem solves an extremely important problem - the problem of nothing.
I used to think that computer scientists had no understanding of mathematics at all - that to pass computer science you don't even need to know basic arithmetic. However, I was wrong, as Dr Anderson has solved a really important problem.

But Dr Anderson has come up with a theory that proposes a new number - 'nullity' - which sits outside the conventional number line (stretching from negative infinity, through zero, to positive infinity).
Genius, if only I thought of that.

The theory of nullity is set to make all kinds of sums possible that, previously, scientists and computers couldn't work around.

"We've just solved a problem that hasn't been solved for twelve hundred years - and it's that easy," proclaims Dr Anderson having demonstrated his solution on a whiteboard at Highdown School, in Emmer Green.
The mathematical revolution is about to begin. I guess it takes a computer scientist to produce the next big step in mathematics; now that the problem of dividing by zero has been solved, mathematicians can relax.

Computers simply cannot divide by zero. Try it on your calculator and you'll get an error message.
I wonder what happens now - does nullity come up instead of the error message?

By the way, I was being sarcastic - how can someone this dumb get a degree in computer science? He even teaches at a university.
The sad thing is that I believed this for about two minutes; then I realized that it must be a joke or something. However, the person really believes he has done something important.
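For reference, here is what a computer actually does today - a minimal Python sketch, not anything from Anderson's "nullity" scheme. Python raises an error on division by zero, while IEEE 754 float arithmetic already has its own values (inf and NaN) for these cases:

```python
import math

# Division by zero: Python raises, much like the calculator's error message.
try:
    print(1 / 0)
except ZeroDivisionError as exc:
    print("error:", exc)

# IEEE 754 floats already have a "number that isn't": NaN.
# An indeterminate form like inf - inf quietly yields NaN instead of raising.
print(math.inf - math.inf)              # nan
print(math.isnan(math.inf - math.inf))  # True
```

So computers don't simply "give an error message" in every case - float hardware has handled this since the 1980s.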

Normal Dude
5th December 2007, 11:30 AM
I agree. I mean, who the heck does improtant things anyways? Sounds like a bunch of nothing.

robinson
5th December 2007, 11:41 AM
This story is a year old. He is either a genius or a lunatic.
http://scienceblogs.com/goodmath/2007/09/the_perspex_machine_superturin.php

Of course, just overturning the theory of computer science isn't grandiose enough. He also claims that this solves the mind body problem, explains how free will works, and provides a potential grand unified theory of physics. From Anderson's own introduction on his website:

The Book of Paragon is a web site that offers one solution to the centuries old philosophical conundrum of how minds relate to bodies. This site shows that the perspective simplex, or perspex, is a simple physical thing that is both a mind and a body.

The perspex can be understood in many ways. Mathematically, the perspex is a particular kind of matrix; concretely, it is simultaneously a physical shape, a physical motion, an artificial neuron, and an instruction for a machine that is more powerful than the Turing machine. In other words, a perspex is an instruction for a perspex machine that is more powerful than any theoretically possible digital computer.

Henners
5th December 2007, 11:44 AM
Does this mean that we're going to need three-way logic?

Yllanes
5th December 2007, 11:50 AM

robinson
5th December 2007, 12:00 PM
Did I mention this story was a year old?

Complexity
5th December 2007, 04:00 PM
He's silly, you're silly.

Go see a doctor.

Unalienable
5th December 2007, 04:34 PM
He's silly, you're silly.
Go see a doctor.

Agreed, Mr. Turing.

But I do believe that computer scientists and mathematicians need to cooperate to pave the way for future revolutions in our understanding of math (and hence, Nature.)

As one of my college professors explained brilliantly to a class full of freshman computer nerds: "Computer science is not the study of computers. The computer itself is secondary; the computer is to computer science what the telescope is to astronomy."

T-Diddy
5th December 2007, 04:35 PM
Gee, now when I write computer code I won't have to write a line saying that if the quotient is zero it should vary a parameter and try again...

Oh, wait - for any machine programmed after 1970 this isn't really a problem, since every program that could get stuck (like an auto-pilot) has such a step in it already.

Zygar
5th December 2007, 09:16 PM
Bago, if you were a mathematician, you would understand that he didn't solve anything at all. Just came up with a name for something we already knew about.

LostAngeles
5th December 2007, 10:47 PM
I wonder if BAGO has ever had to play in the extended complex plane?

Gravy
5th December 2007, 11:08 PM
I was intrigued by the thread title, but it turned out to be much ado about nothing.

Professor Yaffle
6th December 2007, 01:43 AM
BAGO - there, there, you could proberly be really improtant one day too.

ingoa
6th December 2007, 03:03 AM
Quote:
But Dr Anderson has come up with a theory that proposes a new number - 'nullity' - which sits outside the conventional number line (stretching from negative infinity, through zero, to positive infinity).

That's sooo last millennium....

In the Eighties, this was the result we got from an IBM 3090 when dividing by 0:

1/0 ----> Result: NAN

Which translates to "NOT A NUMBER". :D
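NaN behaves unlike any ordinary number - a quick Python sketch of the IEEE 754 semantics:

```python
import math

nan = float("nan")

# NaN is "not a number" in a very literal sense: it is not even equal to itself.
print(nan == nan)       # False - the standard trick for detecting it
print(math.isnan(nan))  # True

# It propagates through arithmetic rather than raising an error,
# so one bad division can silently poison a whole calculation.
print(math.isnan(nan + 1.0))       # True
print(math.isnan(0.0 * math.inf))  # True: another 0/0-style indeterminate form
```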

JonnyFive
6th December 2007, 07:07 AM
I remember this story. It's worth noting that the stories in the news grossly misrepresented what Anderson did or was even trying to do. He was attempting to formalize the division by zero problem (in a different way than was already done), not find a solution to it. The problem was already formalized, and CAS systems already exist that account for it... also, the issue is well known in computer science, as several other posters have pointed out.

Basically, he developed a not-exactly-new way of looking at a problem that was already formalized in order to do something that has already been done in a slightly different way.

Bago, that you thought this was some amazing revolution in mathematics or computing (and didn't bother to actually check the details out) only shows how out of touch with both topics you are.

MRC_Hans
6th December 2007, 07:31 AM
Deleted. Unfriendly content.

Hans

skoob
6th December 2007, 08:15 AM
As one of my college professors explained brilliantly to a class full of freshman computer nerds: "Computer science is not the study of computers. The computer itself is secondary; the computer is to computer science what the telescope is to astronomy."

Your professor should have credited Edsger Dijkstra (http://thinkexist.com/quotation/computer_science_is_no_more_about_computers_than/334131.html).

becomingagodo
6th December 2007, 08:42 AM
Bago, that you thought this was some amazing revolution in mathematics or computing (and didn't bother to actually check the details out) only shows how out of touch with both topics you are.
I was being sarcastic, read the bottom of my original post.

I was trying to say all computer scientists don't know basic mathematics. However, having said that, I now realize the person is not a good representative of computer scientists, given that he is crazy.

JonnyFive
6th December 2007, 08:50 AM
I was being sarcastic, read the bottom of my original post.

That's good, I didn't see.

But then you go and say something even sillier:

I was trying to say all computer scientists don't know basic mathematics.

So, like I said, you are out of touch with both topics.

INRM
6th December 2007, 12:24 PM
So that Perspex / SupraTuring concept he created would prove that consciousness is part of the brain irrefutably? How?

And does the proof of nullity make the Perspex machine right by default, or just potentially?

drkitten
6th December 2007, 02:24 PM
I was trying to say all computer scientists don't know basic mathematics.

Yes, you've been trying to say that for some time and on a variety of threads. Unfortunately, you don't seem to know enough about either subject to have an informed or credible opinion.

Normal Dude
6th December 2007, 02:38 PM
I was trying to say all computer scientists don't know basic mathematics.

That's funny, because I could have sworn that ALL of the upper-division computer science courses here require calculus experience.

lenny
6th December 2007, 03:40 PM
As one of my college professors explained brilliantly to a class full of freshman computer nerds: "Computer science is not the study of computers. The computer itself is secondary; the computer is to computer science what the telescope is to astronomy."

telescopes make visible what is really out there (even if imperfectly).
digital computers create more than they magnify (even if clarifying a few questions of counting)

Unalienable
6th December 2007, 05:02 PM
digital computers create
And what exactly do they create?

If you compute the trillionth digit of pi, did you create something? Or merely make something visible?

Put another way, if your telescope perceives a distant star which was hitherto unknown, did you just create a new star?

Your professor should have credited Edsger Dijkstra.
That detail is saved for the sophomores.

BenBurch
6th December 2007, 05:42 PM
Yes, you've been trying to say that for some time and on a variety of threads. Unfortunately, you don't seem to know enough about either subject to have an informed or credible opinion.

Sometimes we physics people become better software engineers than the CS folks.

LostAngeles
6th December 2007, 06:46 PM
That's funny, because I could have sworn that ALL of the upper-division computer science courses here require calculus experience.

http://www.registrar.ucla.edu/catalog/catalog07-08-199.html#pgfId-83046

The mathematics classes in the "Preparation for the Major" section are the same as mine (my major being math).

The Upper Division courses are dependent upon the programming courses for the most part. Some require engineering classes and many of those are cross-posted with engineering classes.

I won't comment on engineers and math, however... :D

DoubtingStephen
6th December 2007, 06:48 PM
Some say that Alan Turing (a homosexual) was a computer scientist, mostly people that have heard of him. There are even those who say that Alan Turing (a homosexual) was a mathematician.

Some might conclude that it is possible for a homosexual mathematician to be a homosexual computer scientist without necessarily being homosexual and stoopit at the same time. Really, saying all computer scientists are stoopit is really so gay.

LostAngeles
6th December 2007, 06:55 PM
This (http://fukung.net/v/533/derive.jpg) might help BAGO.

Terry
6th December 2007, 07:01 PM
I wonder if BAGO has ever had to play in the extended complex plane?

I dunno... was Riemann a homosexual?

LostAngeles
6th December 2007, 07:10 PM
I dunno... was Riemann a homosexual?

I'm not sure that's integral to the matter at hand. While the sum of BAGO's ambition is Riemann's hypothesis, we're really discussing division by zero. It's all trapezoidal, if you ask me. Simpson thinks so too. As far as BAGO goes, I believe Steven Tyler once sang something relevant...

Terry
6th December 2007, 07:13 PM
well what goes around is probably isomorphic to what comes around...

Scott1972
6th December 2007, 08:30 PM
Why are those pesky computer scientists always causing so much trouble?

I may have to disregard the zombies and start preparing for the computer scientist apocalypse.

jmontecillo01
6th December 2007, 10:03 PM
In modern programming, there is a difference between zero and null. Zero means just that: zero. Null means the absence of a value.

Take a database, for example. A column that contains null does not contain anything at all; it is not equivalent to zero. In fact, in most modern languages, if you try to access a field containing a null value, you get an exception, not zero.
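The distinction is easy to see in code - a small Python sketch, using a hypothetical dictionary to stand in for a database row (None plays the role of null):

```python
# Hypothetical record: "count" holds a real zero, "rating" holds null (None).
row = {"count": 0, "rating": None}

print(row["count"] == 0)      # True: zero is a value
print(row["rating"] is None)  # True: null is the absence of a value
print(row["rating"] == 0)     # False: null is not equivalent to zero

# Using the null as if it were a number raises an exception, not zero.
try:
    row["rating"] + 1
except TypeError as exc:
    print("exception:", exc)
```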

Nancarrow
6th December 2007, 10:08 PM
To be fair to BAGO he did say at the end of his OP that he was being sarcastic and did NOT in fact hold this story in high regard. However, most of us have missed the main point, which is to inquire,

BAGO, Have you seen a doctor yet?

SomeGuy
6th December 2007, 11:22 PM
I may be really dense, but having seen this in so many threads:

Why should Becoming A God 0 see a doctor?

LostAngeles
6th December 2007, 11:51 PM
I may be really dense, but having seen this in so many threads:

Why should Becoming A God 0 see a doctor?

jsfisher
7th December 2007, 12:00 AM

Are you referring to his or ours?

LostAngeles
7th December 2007, 12:30 AM
Are you referring to his or ours?

His. He seems to take them as a mark of genius or something. If that were true, I'd be Valedictorian.

Really, he needs to see a doctor.

JonnyFive
7th December 2007, 05:43 AM
As far as BAGO goes, I believe Steven Tyler once sang something relevant...

Complexity
7th December 2007, 06:50 AM
I may be really dense, but having seen this in so many threads:

Why should Becoming A God 0 see a doctor?

Unmanaged obsessive-compulsive disorder, evaluation for other psychiatric disorders, serious headaches.

LostAngeles
7th December 2007, 08:58 AM

No.

"Riemmmmannnn!" (say it aloud)

I can't (thankfully) take credit for that joke. Back in Calc 2, when the professor started Riemann integrals, my friend asked if it was like the old Aerosmith song, then started to sing. He was awesome. He also, in response to Newton's method, announced, "Newton was a hack, sir!" :D

JonnyFive
7th December 2007, 09:09 AM
I can't (thankfully) take credit for that joke. Back in Calc 2, when the professor started Riemann integrals, my friend asked if it was like the old Aerosmith song, then started to sing. He was awesome. He also, in response to Newton's method, announced, "Newton was a hack, sir!" :D

Sounds like good times to me. :)

lenny
7th December 2007, 02:06 PM
telescopes make visible what is really out there (even if imperfectly).
digital computers create more than they magnify (even if clarifying a few questions of counting)

And what exactly do they create?

If you compute the trillionth digit of pi, did you create something? Or merely make something visible?

Put another way, if your telescope perceives a distant star which was hitherto unknown, did you just create a new star?

i'd agree your "make visible" is a nice phrase for what the telescope does almost always, and for what any computer does in the case of the trillionth digit of pi, or when solving other maths problems of similar ilk. using a computer to do an exact calculation for you quickly is what i meant by "a question of counting".

but astronomers and physicists, and even applied mathematicians, tend to use computers to simulate things, which is rather different.

ten different mathematicians using ten different computers will independently get the same first trillion digits of pi (even better, they will all agree on who is wrong if anyone differs).

ten different physicists simulating a rotating, thermally driven fluid will get ten answers that differ after the first few bits: the bits that differ are what the computer "makes up".

as each digital computer is a finite-state, deterministic machine, a purist can argue that each is "just" magnifying a particular set of integers - but those made-up bits are quite different from the trillionth digit of pi, no?
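One concrete source of those made-up bits: floating-point addition is not associative, so the grouping a program happens to use leaks into the low-order bits of the result. A tiny Python demonstration:

```python
# The same three numbers, summed in two different orders.
a = (0.1 + 0.2) + 0.3
b = 0.1 + (0.2 + 0.3)

print(a)       # 0.6000000000000001
print(b)       # 0.6
print(a == b)  # False: the low-order bits depend on evaluation order
```

Two simulations that merely sum their terms in a different order will already disagree in the last bits, before any physics enters the picture.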

Unalienable
13th December 2007, 09:11 PM
ten different physicists simulating a rotating, thermally driven fluid will get ten answers that differ after the first few bits: the bits that differ are what the computer "makes up"

...

those made-up bits are quite different from the trillionth digit of pi, no?

I see the distinction you are trying to make, but I think it is illusory. The biggest difference I see between the job of calculating pi and the job of simulating fluid dynamics is simply that one problem is more well defined than the other.

The computer, or Turing Machine, is just a set of states combined with rules that dictate which state leads to another. (Not a formal definition, but a pretty good summary.) To a mathematician or computer scientist, all states are essentially the same: they lead to another state, which leads to another, which leads to another, until our computer either terminates or goes into a loop.

When I have the movie Casablanca playing on my computer from a compressed format, and Rick is about to say "We'll always have Paris", that's one of the fabulous states that my computer can be in. That's the human side of me. The computer scientist in me sees this gigantic bag of computer states, and I've just reached into the sack to pluck out state number n, and now I am letting my machine process the sequence that starts at that state.

So ultimately I believe that as far as computers go, calculating pi, predicting fluids, watching old movies, it's all the same stuff: slavishly hopping from one state to the next, from a giant set of possibilities. Yes, I know it sounds boring--but I still cry at the end of Casablanca.
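That state-hopping picture can be sketched in a few lines of Python - a toy rule table, not any real machine, with state names made up purely for illustration:

```python
def run(rules, state, halt_states):
    """Slavishly hop from one state to the next until the machine halts."""
    trace = [state]
    while state not in halt_states:
        state = rules[state]  # deterministic: each state names its unique successor
        trace.append(state)
    return trace

# Hypothetical three-state machine that steps A -> B -> C and halts.
rules = {"A": "B", "B": "C", "C": "HALT"}
print(run(rules, "A", {"HALT"}))  # ['A', 'B', 'C', 'HALT']
```

Whether the trace spells out digits of pi or frames of Casablanca, the machine is doing the same thing: following the rule table from one state to the next.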