Tuesday, February 12, 2008

Raymond Smullyan, Mentats, and Naivete

I've been reading a lot of Raymond Smullyan lately (2 books done, halfway through another one, and two more lined up once they're returned to the library), and halfway through the first book he was already on my all-time favorites list. I strongly recommend his stuff - there's nothing quite like it. I'm particularly fond of the way he points out the paradoxical nature of everyday life. It's gotten to the point that I found myself making up a hilarious conversation while I was shaving a few days ago. Here it is:
A: Some questions are not meant to be asked!

B: This is certainly a possibility, but then I must ask you: What are these questions? Why must they not be asked? And most importantly, are my last two questions meant to be asked or not?
I don't, as a rule, have unquestioning respect for anything, but if I did, this would have blown all my justifications of that notion straight out of the water. This probably isn't good enough to be worthy of a Smullyan book, but you get the idea. In his essays, he points out weird things about all sorts of notions, some of which we take for granted, others which we don't (though we might know people who do), and he does this with a type of logic that I can only characterize as naively innocent, insightful and incisive. (W00t - I made an accidental alliteration! And what do you know, in pointing it out, I did it again! mrgreen)

It's important to realize that when I say naive here, I mean it in a highly complimentary sense. I've recently come to believe that naivete is a very strong defense against intellectual garbage, and one should always adopt a naive mental posture when evaluating a new idea (old ones, too!). Your worst enemy in these matters is a huge accumulation of 'things I know'.

It's the same thing with those 'aha!' moments that we all have. One of my favorite quotes from Dune deals with just that. It's a quote from a text used by Mentats - the human computers (they're really far more complex than that simplistic description) of the Dune universe.
“Ready comprehension is often a knee-jerk response and the most dangerous form of understanding. It blinks an opaque screen over your ability to learn. The judgmental precedents of law function that way, littering your path with dead ends.
Be warned. Understand nothing. All comprehension is temporary.”
I've been thrown by my ready understanding over and over again, but thankfully I'm now conscious of it (thanks to a friend of mine who is charmingly naive and a marvellously efficient mathematical reasoner), and can fix it. Obviously you can't become naive, but you can certainly cultivate an attitude of intellectual naivete when integrating new concepts. I suspect that naivete informed by knowledge born of sophistication is a pretty deadly combination.

Incidentally, ready understanding is actually more dangerous the smarter you get. When a smart person understands something really quickly, he usually has some sort of approximation to the truth, or at least something that seems to work on the surface. A lot of the time, it'll work enough to seem right (this is what happened to me, in fact). But it stops you from considering complex concepts in greater detail, and makes it even harder for you to correct/improve your flawed understanding of them.

Does this mean you should sit down and consider everything you do with great concentration? I'll leave that up to you, but don't forget this Mentat admonition:
“Many things we do naturally become difficult only when we try to treat them as intellectual subjects. It is possible to know so much about a thing that you become totally ignorant.”
This applies to lots of philosophy and theology, for instance. Very large parts of these two disciplines (and they're not the only ones) are webs of nonsense spun out of over-intellectualization of simple concepts - either because this gets you famous, or because it preserves a fragile worldview that can't survive simple questions - so you make it survive complex ones and point to those instead.

Unfortunately, most of us lose this between childhood and adolescence, and we even wind up thinking that it's an essential part of maturity. I think it may have its origins in that weird social ostracism thing that children practice - "Look, Timmy doesn't know how to tie his shoelaces! Hahahaha - what a dweeb!" - sound familiar? You can bet Timmy learned how to tie his shoelaces before the week was out, to the immense surprise of his mother who had been fruitlessly trying to teach him for several months.

But it isn't just shoelaces - you can take a whole bunch of nutty propositions and just render them 'true' by mocking anyone who doesn't see why they don't make sense. Just like in that story with monkeys and bananas. Repeat for a little while, and you've got traditions, customary practices, shared beliefs, stories...culture! Then you wind up laughing at the strange practices of a tribe in some remote spot in Africa, without realizing that your own traditions are the product of the very same bugs/features in human cognition that created the traditions you laugh at.

Fake sophistication is the second-most common commodity on the planet (next to wasted human potential). If we don't recognize the fact that many of our social practices are bursting at the seams with contradictions, we'll never be able to fix them. It doesn't help that many cultures subscribe to the silly view that their way of life will endure forever, is the one true way to live, etc, etc, etc. This simply isn't true. There never has been such a way of life, nor is one even possible. Some are objectively better than others. In other cases, the only way to make a judgement is to look at the prevailing conditions and see how each way of life performs under them. And conditions always change - not recognizing this is an ancient mistake, a minor mental adaptation from back when things changed so slowly that you could play by the same rules for a thousand years without any trouble.

This won't work anymore, though - not in a world that changes as fast as this one. If there's one lesson the human race should learn, it's this: Finished products are for decadent minds.

I wish that last line was original, but I feel compelled to confess that I filched it from one of Isaac Asimov's Foundation novels.

So in conclusion, go forth and be naive!

Friday, January 04, 2008

Metaphors, Holistic Learning and Digression

I just read an interesting article on Lifehack on the relationship of metaphor and learning. It happened to spark off a fairly detailed stream of thought, so I figured I'd blog about it. Give it a read first, it's fairly short.

Now, when I first read the article, I immediately liked it, because I've actually done a bit of thinking on stories and metaphor in a more personal way, applied to how I tend to remember things. Anyone who has had a long discussion with me (or who reads this blog) will notice that I have a very strong tendency to digress all over the place. This isn't just something I do while blogging. In most discussions, I'm not just explaining my point of view - I'm also feeling out new connections as I think. It's not a conscious decision - I just do it naturally, out of some subconscious drive or something. I'm the guy who always goes "This reminds me of..." and then I come up with something that no one else in the conversation has been reminded of. I actually realized how weird this was when a friend pointed out several years ago that I always got reminded of incredibly unrelated things during any discussions. I had no idea that it sounded odd to other people until he told me. mrgreen

(And that's not the only thing - I didn't realize my eerie ability to detect spelling errors instantly by merely glancing at text until my dad noticed me doing it and pointed it out. Maybe I'll blog about that some time. There's another thing that I didn't know about until someone pointed it out; since it deals with stories and memory, I'll talk about it in a bit more detail - but not in this paragraph. And that's because this parenthetical paragraph is really another digression...redface)

One of my programming team coaches pointed out last year that I often quoted something from some book I had read, and usually did so with extremely high levels of accuracy and detail that you wouldn't normally expect from someone making a mostly offhand comment. I suspect those details are unnecessary and boring, so maybe he was giving me a hint...rolleyes

At the time, I thought about it for a moment, and agreed with him. Later, I remembered that conversation again, so I thought about it a bit more. The best explanation I could come up with was that all those things I quoted often formed some kind of coherent narrative, or at least part of one. And as the article that inspired this post points out, stories are really easy to remember. On top of that, I've been reading very intensively since I was maybe five (clearly I'm obsessive), so I process stories better than less trained people, so to speak.

Why are stories easier to remember, though?

I suppose there are several reasons, but the central theme of them all is the same - the human mind is optimized for dealing with social complexity. Just ask anyone who is addicted to a soap opera, and chances are they'll be able to narrate everything that happened from the beginning without a hitch. We take that for granted, but consider what an incredible feat that is. You have a terribly unimaginative plot, characters that all fall into a small number of classes whose members can be (and often are, for the purposes of relationship drama) interchanged, and a bunch of phenomenally dreary interactions between them, drawn from a pretty small set, on top of that.

And now consider the fact that despite all that, no matter how much you hate soap operas, just watch a few episodes, and you'll have near perfect recall of what happened a few weeks later.

Now try reading three chapters of a bad textbook, and see what you remember at the end of the month.

Works the same way for songs - people who can't remember stuff they've read will nevertheless be able to sing hundreds of songs with perfect command of the lyrics, and they don't find it surprising. (Ah, if only we found more things surprising, the world would be a better pla....er, never mind) This happens because of the rhythm and repetition, the way the words follow the music, and the associated emotional experiences. Our brains are extremely good at processing vivid sensory and emotional experiences, not to mention music. There you go - instant memory aid.

Seriously, though, this sort of optimization shouldn't really be surprising. Closer to my field, think about recognizing faces. We do it all the time - hell, we can tell twins apart with relative ease if we're around them for a while. But getting a machine to do that is phenomenally difficult. One reason it's so hard is that we expect so much - we can do it effortlessly, so we expect great things from our computers. Doesn't work, of course - we've got specialized brain hardware that helps us out, over and above our complex visual systems.

And now observe how this digression of mine comes around to something useful. There was an episode of My Brilliant Brain about chess grandmaster Susan Polgar, which showed how parts of the brain normally involved in face recognition lit up when she looked at chessboards. Sounds like a pretty good explanation for the phenomenal chessboard memory exhibited by chess experts, doesn't it? Faces are pretty easy to remember because we notice certain features and how they fit together. It's a form of chunking, really.

Interestingly, if you set up chess pieces in some configuration that won't ever occur in actual play, chess players don't do any better remembering those than normal people. Presumably it looks like a picture of a face cut into weird pieces and reassembled in ways that don't make sense.

So it should be obvious now why it'll be easy to learn stuff that you can map into a narrative of some kind. But what about all the rest? How does one map a mathematical proof to a story? Some things are just too far removed from our experience to 'narrativize', if there is such a term.

This doesn't mean you should give up there - far from it. There are all sorts of other things our brains are optimized for - you can use them too. For instance, muscle memory comes in handy when you're solving the Rubik's cube. It never takes me more than 4 minutes to solve any configuration (unless someone is cheating by switching stickers around wink) and people who are watching always ask me how the hell I'm going so fast. Well, first off, I'm pathetically slow - the world record is 9.41 seconds, if I remember right. Secondly, once you learn a method for solving the cube, a bit of practice makes the component motions second nature, so you don't have to think about them anymore. That gives you a lot of speed, and it's no different from riding a bike or driving - stuff that you did consciously becomes automatic.

So that's one reason not to be discouraged about learning. Another reason is that you can always use a second-order effect to learn more easily. There are many skills that become automatic once mastered - stuff like basic algebra, for instance. The concepts involved can become the building blocks of more advanced concepts. So you expand the coverage your narratives have by embedding well-known narratives in them as atomic components. (Star Trek fans, remember "Darmok and Jalad at Tanagra" cool) Mathematicians do this all the time - as von Neumann once said, there are things in mathematics that you don't understand so much as get used to.

It's helpful to try and spot underlying themes for both explanation and understanding. Most fields build up complex concepts from simpler ones, but when you're trying to grasp a lot of stuff, for an exam or something, it's not that easy, and many times it's downright unnatural. It's like thinking about quarks and leptons - sure, they're the building blocks of matter, but when was the last time you saw a parent telling a kid about the subatomic composition of an elephant? He'll say stuff like, "It's big, it's got a trunk and big ears..." and so forth. Sure, a particular configuration of quarks and leptons makes up an elephant, but that's not the right scale at which to think about your friendly neighborhood pachyderm! (Bit of an odd neighborhood, that...) So it makes sense to see how the big pieces fit together, because they're easier to deal with, and then go down to the basics.

Computer Science people, think about Automata Theory. What's easier to understand - Turing machines as a big tape with symbols on it and something messing with them as they go by? Or the formal description with transition functions and tape alphabets and whatnot? Sure, the second is more precise and fundamental, but the first is closer to human experience, so you can reason about it more easily. So you start with students using the first explanation, and then develop the second one, which gives them time to tie the two together. Eventually you can bounce back and forth between the two, and even use the tape representation as a thinking tool (What if I have two tapes? Suppose I could only go forward...Maybe there are only a fixed number of symbols allowed....). You can look at the other automata in similar ways, and there are multiple ways to tie it all together - the Chomsky hierarchy, the languages accepted by each class, the differences and equivalences and so on.
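In fact, the informal 'tape with symbols' picture translates almost directly into code. Here's a minimal sketch in Python - the bit-flipping machine and all the names in it are my own invention for illustration - showing how the formal description (states, tape alphabet, transition function) and the tape-being-messed-with view are really the same thing:

```python
# A Turing machine via its formal description: states, a tape alphabet,
# and a transition function (state, symbol) -> (state, symbol, move).
# This toy machine flips every bit on the tape, then halts at the blank.

BLANK = "_"

def run_tm(tape, transitions, start, accept, max_steps=1000):
    """Simulate a single-tape Turing machine; returns the final tape contents."""
    tape = list(tape)
    state, head = start, 0
    for _ in range(max_steps):
        if state == accept:
            return "".join(tape).rstrip(BLANK)
        if head >= len(tape):          # the tape is unbounded to the right,
            tape.append(BLANK)         # so grow it with blanks on demand
        symbol = tape[head]
        state, write, move = transitions[(state, symbol)]
        tape[head] = write
        head += 1 if move == "R" else -1
    raise RuntimeError("no halt within step limit")

# Transition function for the bit-flipper: scan right, flipping as you go.
flip = {
    ("scan", "0"): ("scan", "1", "R"),
    ("scan", "1"): ("scan", "0", "R"),
    ("scan", BLANK): ("done", BLANK, "R"),
}

print(run_tm("0110", flip, start="scan", accept="done"))  # -> 1001
```

The `flip` dictionary is the precise, fundamental part; watching `head` march along `tape` is the human-friendly part - and the two-tape and fixed-alphabet "what ifs" above are just small edits to this sketch.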

It's a little more explicit in CS and mathematics than in other fields, but hopefully I've made my point.

So the next time I start boring you with something totally unrelated, see if you can spot the weird web of connections between all the things I say. You might find it enlightening, and more importantly (yes, I chose this word with a great deal of care), I probably would too, if I wasn't already consciously aware of the links and someone pointed them out to me. biggrin

And in conclusion, I shall mention that I was suddenly reminded of something Eddie Izzard used in his stand-up routine Dress To Kill. He's poking fun at the way he jumps from one topic to another seemingly unrelated topic. So when he starts talking about The Great Escape, he explains it like this.

"So you've got a bunch of British actors, and I'm British - link up there. Steve McQueen, action hero - I'm an action transvestite, link up there." mrgreen

PS: I strongly recommend watching this guy's stuff - a funnier comedian I have never seen.

PPS: I just got reminded of something else that I've been musing about, which struck me as being related to this post - has to do with the AIs of the Golden Age scifi trilogy and how they think. I'll leave that for a later post.
biggrin

Thursday, January 03, 2008

It's freezing!

Looks like the weather gods are just begging me to blog by producing blog-worthy climate around Orlando. As I mentioned in one of the previous posts, there's a cold front hitting the general area, and so temperatures have dropped.

Yesterday's forecast on my nifty AccuWeather toolbar in Firefox was for about 11 C. It seems the estimate was off - I paid attention to the temperatures all day long, and they were never higher than 8 C, which is usually the average minimum!

It seems that my reaction to jet lag is rather strange - my daily sleep requirement decreases to somewhere between 5 to 7 hours, which is a lot lower than my usual 10. I experienced this for the first few days when I was in India too, so I suppose it's just the way my body adjusts now. In any event, just like last night, I woke up at around 5 AM, completely refreshed, and imagine my surprise when I fired up Firefox and discovered that it was freezing outside.

I mean 'freezing' quite literally - the temperature is 0 C! eek

This is the coldest weather I've ever experienced, so it's tremendously exciting for me. Especially because I'm comfortably ensconced indoors where it's all heated and warm. mrgreen

A little googling showed me that this isn't unknown - the lowest recorded January temperature here is actually seven below zero, but that's pretty damn rare. It looks like I got lucky. rolleyes

Anyway, since I'm up, I might as well do stuff. Hopefully it'll warm up later in the day.

Wednesday, January 02, 2008

Of stacks and suitcases

As promised in the previous post, here is the interesting incident that took place as a result of sitting in the front of the plane near the window, and having way too much boredom to deal with.

In stark contrast to the previous flight from Mumbai to Newark, I was seated as far ahead as possible without being in first class - row 5 (rows 1 to 4 are in first). Sitting on the right window seat of row 5 gives you an excellent view of the suitcases being loaded - there's a conveyor belt heading into the plane and people in orange jackets tossing bags onto it. Bear in mind that this probably depends on the model of the aircraft, but this was a relatively small domestic flight, so you had a small plane with two aisles. A huge advantage in sitting right after first class is that they have a partition between first and the rest of the plane, and so the people in economy seated right after the partition have more legroom than anyone on the plane, including the blokes in first class! mrgreen

Now they had been loading up the cargo since before I got to my seat, but once I was there I started watching them to see if I could spot my suitcase, which is easily recognizable owing to the big fat red ribbon tied to its handle, to...er, make it easy to recognize. I did take along another tiny little suitcase as well, but that was pretty nondescript, which ironically made it easier to spot. This just goes to show that in the presence of large amounts of novelty, simple plainness actually stands out, thus rendering the nondescript somewhat conspicuous. And the fact that I write this demonstrates that I love silly little contradictions like this, contrived though they may be.

In any event, I was unable to spot either of the suitcases, which led me to figure that one of the following had occurred: (a) They had missed my suitcase, or (b) They had loaded it before I got there to watch them. Given that (a) is usually rare, I figured (b) was more likely. Being a well-trained logically minded sort of chap, I also considered the possibilities that there were other loading points out of my sight, either further back or on the other side of the aircraft. However, after watching these guys for a bit, I figured that there were no such additional points, either (a) because of the annoyance of having to assign multiple teams to one plane, or make the same team switch places all the time, or (b) because it would have been a pointless thing to do, and so probably bad design.

At this point, being a big fan of meta-thinking, I also realized that (a) I seem to be considering a lot of alternatives in the form of alphabetically indexed lists, and (b) this is getting out of hand and I should stop at once.

Anyway, being a nutty computer scientist, I observed that the behavior of the suitcase loading system was analogous to a stack - last in, first out, because they would have to bring them out in reverse order on the same conveyor belt, as far as I could see. Consequently, based on my hypothesis that my suitcase had been loaded at the beginning, I concluded that I'd have to wait pretty long for it to come out on the carousel.
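The stack observation above can be sketched in a few lines of Python (the bag names are invented for illustration):

```python
# A single conveyor belt into the hold behaves like a stack:
# last bag in is the first bag out.

def load(bags):
    """Push bags into the hold in loading order."""
    hold = []
    for bag in bags:
        hold.append(bag)        # push
    return hold

def unload(hold):
    """Pop bags back out onto the carousel in reverse (LIFO) order."""
    out = []
    while hold:
        out.append(hold.pop())  # pop
    return out

loaded_first = ["red-ribbon suitcase", "plain little bag", "golf clubs"]
print(unload(load(loaded_first)))
# -> ['golf clubs', 'plain little bag', 'red-ribbon suitcase']
```

Whatever goes in first sits at the bottom of the hold, so it surfaces last on the carousel - hence the long wait predicted below.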

What? I do programming contests - I have to notice little things like this. Sort of like how Chuck Norris always has to solve his problems using roundhouse kicks, as we all know he does.
mrgreen

So there I was, armed with a hypothesis and a testable prediction, and all I had to do was wait patiently, doze through the flight and watch an episode of the Simpsons, and then get off and experiment, i.e., wait again.

Sure enough, I was virtually the last person out of baggage claim, and so I received the perfect ending to a nice holiday, in the form of a successful application of the scientific method, the joy of discovery and all that...

Okay, perhaps that is a bit too geeky, even for me.
confused

Back in Orlando

(The following paragraphs were written while I was sitting at Newark airport feeling terribly bored and too bugged from travel to read anything.)

So here I am sitting at Newark airport at 8 AM on New Year's Day, waiting for a flight that will leave in two and a half hours. Having nothing else to do, I've found a convenient power outlet, plugged in my laptop, and decided to chronicle the last few hours of travel. I would have blogged straight from there, but in a display of startling silliness, they don't have free wi-fi! mad

I also have a strange feeling that my writing style is a bit off, for some reason. Perhaps my mind has just been scrambled by all those hours in the air, not to mention airplane food...

First off, I'm entitled to brag a bit, seeing as I'm now a member of the elite club of people who have celebrated the new year by flying west across the planet, extending the night several hours beyond its usual length. (Say, that's a great idea for a new vampire movie...razz) Unfortunately, instead of partying the whole way, I spent it in the confines of an uncomfortable airplane seat. Someday I'd really like to find out how much money they would lose by extending the leg room by a foot or so...

Anyway, thanks to Continental's amazing inflight entertainment system, I spent most of the extended night watching 4 episodes of House, an episode of Futurama, and all three Matrix movies. They've got an on-demand thing going, so instead of being forced to watch whatever they have whenever they have it, you just browse their list, pick whatever you want, and watch it at your convenience. This is obviously the way to go, and I'm a bit confused as to why I didn't have this on the flight I took to Mumbai. I didn't even realize Transformers was on until it was too late, or I could have watched that instead. I seem to remember vague plans being made by a whole lot of friends to see it when it came out, but nothing ever materialized.

I did sneak a bit of sleep on the plane, or at least I tried. I'm not entirely sure how long I slept, for a bunch of reasons. They had this flight map feature on the entertainment system which kept displaying a bunch of times - time at the destination, time at the origin, expected flight time left, and so forth. Left me totally confused, and I suspect I may have dropped off for several hours without realizing it. The stuff I watched only adds up to about 8 or 9 hours, which still leaves at least 8 hours unaccounted for. It did feel like my efforts to find a comfortable sleeping position lasted that long, but I can't imagine how I pulled that off without going nuts. What made it more annoying was the guy sitting next to me, who spent the entire trip in a state of blissful slumber, in the most astonishing positions ever. evil

Maybe I have gone nuts, and like any insane person, I don't know it.

At any rate, I'm fully awake now and not feeling particularly sleepy. My eyes don't feel strained, I don't have dark circles, I've got a fair bit of energy - whatever the case, it seems to have worked out. More on this when I get home.

Couple of interesting things - in the flight to Newark, I was seated in the last row, something which I've never done before. To be perfectly precise, it's not the last row for the center aisle - that has one extra row. So there are 45 rows in the center aisle, but only 44 in the left and right ones. Also, the last row has only two seats in the left and right aisles. Interesting bit of asymmetry, that.

And before I forget, it seems the ghost of Douglas Adams was roaming about Mumbai airport on New Year's Eve. The next time you get on a plane via an aerobridge, check out the numbers on the control display - it's just by the entrance to the plane, where the aerobridge connects to the airplane door. There's a little display marked speed (in cm/s, even). Now you'd expect that to be 0, but instead it kept on alternating between 00 and 42. Eerie, but in a funny way.

Verily, the Ultimate Answer shows up in the oddest places...

I'll end this here for the time being. I'm getting a bit stir crazy, so I'll probably move about, stretch my legs, do the Rubik's cube a few dozen times - that sort of thing.

(This concludes the Newark airport leg of this blog entry.)

I'm back in Orlando now, so all you keen minds out there are justified in concluding that the flight did in fact arrive without any problems. We left a few minutes late but managed to arrive on time, though I did have to wait a bit for my luggage. Interestingly enough, I was actually expecting this, and here's why. (Expect me to beat around the bush a bit - I love giving background information, for some reason.)

So anyway, I got home and immediately noticed that the weather was absolutely amazing. The clouds were out, so I didn't have to get annoyed by the sun in my eyes. Still, it was mid-afternoon, which meant the temperature was a very pleasant 21 C. There was a very slight breeze, so it was nice and cool too. In short, paradise. Also, the air was clean, so my nose was happy, or would have been if it were capable of human emotion.

Having noticed all this, I did the most obvious thing. I stayed indoors, checked my email, ate some noodles, watched some Eddie Izzard...

I avoided most of my jet lag until about 10:30 PM or so, at which point I found myself yawning. So I hit the sack and slept like a log until about 5 AM, at which point I woke up, watched the Four Horsemen of the Anti-Apocalypse, and remembered that I had a blog entry lying around, and so this came along.

It turns out that there's a cold front coming in - arrived last night actually. So, astonishing as it might seem, it's actually 5 C outside. eek Needless to say, in Orlando, this is insane.

I'm feeling slightly sleepy again, so I'll go lie down for a bit and see if I fall asleep again. Toodle-oo and all that.

Ah, and before I forget, happy new year to everyone!

Wednesday, December 19, 2007

Of overdue blogs, year-end updates, and NRI syndrome

So here I am, blogging again. As is usual, I shall preface the main body of the blog with some random remarks about why I haven't blogged for so long. Actually, scratch that - I was planning to explain why, but it's been so long that I can't really remember why I stopped blogging in the first place. So I'll refrain from boring everyone with some made-up excuse, and save myself the bother of having to think one up.

So I'm back home in Mumbai, albeit for a measly three weeks. Winter break is not very long, sadly. My mom desperately wants to fatten me up with home cooking (and of course I have no objection to this :P) but this doesn't give her much time. She has, however, enlisted everyone who sees me in a massive conspiracy aimed at convincing me that I've lost weight, thus resulting in rather boring "He's become thinner, hasn't he?" lines from every second person I run into. Now what do you say to something like that? Ultimately, I decided to come clean and tell the truth - that this is all part of my sneaky plan to get fed tons of amazing home cooking.

I've come down with something a friend described as "NRI Syndrome". In a nutshell, I've lost my Mumbaikar immunity to the city's polluted air, and ever since we landed, my nose has been switching between running like a waterfall, or blocked so badly that it might as well be a rock stuck to my face. Nose drops are the only way out, sadly. I've never had to use them continuously for a week - two or three days is the outside limit - but this leaves me with no option. And now there's smog all over the place - when did that happen? It's like someone found out I left and gave them the all clear to take air pollution to the next level.

And when did the design of 2 rupee coins change? I saw one a few days ago and almost thought it was fake. It looks rather out of place among all the others, if you ask me.

Moving on, here's a quick semester recap: academically, it rocked once again. I passed the PhD qualifiers - first try, easy pass and all that. They don't tell you just how well you did, so all I know is that I had an 89 in Formal Languages and Automata Theory, and did 'extremely well' in Artificial Intelligence. No idea about Algorithms and Programming Languages, other than the fact that I did well enough to pass with ease. I'm one of 6 people who passed, out of a total of 12, which isn't too bad. Don't know how many of those 6 were on their first try - at least two weren't, as far as I know. In any event, that's one major milestone that's over with.

I've managed to get into the hallowed ranks of two-time ACM ICPC World Finalists again this year (our region is obviously way too easy), which works out very nicely for two reasons. First, because I've done almost as well as can reasonably be expected - two years of competition, two trips to Finals. Second, being a finalist twice renders you ineligible to compete again, which fits nicely into my plans, seeing as how I'll have more energy to devote to research once I find a nice dissertation topic, which I'll be looking into next semester. I've got several ideas buzzing around in my head, but nothing nearly concrete enough to nail down, and I think I need a bit more exposure to the prevailing zeitgeist. That means I'll probably be reading a bunch of papers throughout the next semester, which should be pretty stimulating. Of course I need to balance this with Finals training too, but I figure anyone in a PhD program needs to get as much mental training as possible.

This is also why I picked up a copy of God Created the Integers, which is a humongous collection (1200 pages, in really small print!) of the greatest mathematical works of all time - Euclid, Newton, Laplace, Boole, Gauss, Riemann, Godel, Turing...all the really big names, and edited by Hawking, with commentary. I was inspired by the way literary types often refer to 'the classics' - the works of the really great writers. These are the classics in my line of work - after all, it all boils down to mathematics in the end. It's not all that different from being a writer, I figure - you start by imitating the most amazing pieces of writing you can find, and then absorb whatever you like, eventually evolving it into your own style. This changes a bit for research, of course; you don't want to imitate, except in private for educational purposes, but I think it will be rewarding to try and follow the general reasoning of the really smart people who changed our understanding in fundamental ways.

Anyway, I'm going to go back to reading Euclid and Eudoxus now (old stuff, but still cool). Virtually a religious experience, I tell you...

PS: I'm planning to start blogging in earnest again. Hopefully the plan will survive contact with the next semester. Wish me luck! :)

Wednesday, June 20, 2007

In New York

I'll be in New York until Saturday evening for a little something Google has planned. Looks like going to the World Finals has its perks.

Sunday, June 17, 2007

Random ramblings...

So I've had one of the most amazing holidays ever - and I squarely blame it for all the difficulty I'm experiencing in getting out of the vacation mood. My parents and sister came over to the US, and we've basically been all over the place. Three days at Disney World, followed by Washington DC, New York, New Jersey, Niagara...and loads of shopping, of course. It was one hell of an experience, and certainly a vacation to remember.

Since I'm not taking any classes this summer, I'm busy studying for the PhD qualifiers coming up at the end of Fall, as well as getting some practice done for the next ACM regionals (and World Finals, hopefully). Life is good.

I've recently been playing Jedi Knight 3: Jedi Academy to pass the time. Now I've always thought the lightsaber really is 'an elegant weapon for a more civilized age', and this game finally gave me the chance to be a Jedi. You won't believe the fun you can have deflecting blaster bolts back at your assailants and taking down a dozen opponents with a few well-placed slashes and spins (or just pushing them off ledges with the Force). Ah, the carnage...very satisfying.

As anyone reading this might have figured, I have absolutely no issue with violence in video games (besides, Jedi Academy doesn't even have blood - lightsabers cauterize as they cut, so limbs can get sliced off without any splatter). Violence and aggression are an inescapable part of our simian ancestry, and modern society rarely has enough outlets for them. I strongly recommend violent video games as one such outlet.

There is an amazing sort of beauty to lightsaber combat, even though the games don't make it look quite like the movies. I'm particularly fond of what I term the 'samurai' lightsaber style, in which I basically wait for the enemy to attack, step slightly out of the way and then strike, timing it to slip exactly through the opponent's guard. It took some time, but I'm now pretty good at dual saber combat. I've always had a thing for fighting styles that use two swords - Miyamoto Musashi himself fought with two in his Hyōhō Niten Ichi-ryū style, and it's just cool. Lightsabers take it a step further: wielders tend to spin their sabers a lot, and spinning two at once and finishing with a double slash looks even better.

Another thing I do is watch the incredible comedy of Eddie Izzard. It's a very different sort of comedy from what you normally see - it's a sort of Monty Python meets stream-of-consciousness monologuing, with a bit of mime and cool sound effects. This is one of my all-time favorites.

For the last couple of days, I've been watching videos from Beyond Belief 2006. I've only watched about 3 hours total, but these are some of the smartest and most insightful people around, and it's just incredible to listen to what they say, and often, how they say it. I'm slightly fascinated by the art of holding an audience spellbound, and I figure watching some examples of it will help - at least on a subconscious level.

So far, the most interesting talk was by Neil deGrasse Tyson, who made some amazing points. What interested me (aside from the humor) was the novel angle from which he approached things. There was a little section of his talk called 'Naming Rights', which he used to point out that things are usually named by the people who get there first - and those blokes are the ones who (at that point in time) have a sort of culture of excellence in that particular field.

For instance, a lot of the heavy elements in the periodic table have names associated with the United States - Americium, Californium, Berkelium - and all because they were discovered in the US at a time when there was a great deal of emphasis on that sort of thing among American physicists, and they got there first. The constellations have mostly Greek names, because the Greeks were the first to stick their myths in the patterns you could see in the sky. Two-thirds of the named stars have Arabic names, because the Arabs catalogued them during the Golden Age of Islam, among a gazillion other achievements. Life was good for them until about the 12th century, when a fellow called Al-Ghazali wrote a scathing critique called The Incoherence of the Philosophers, and effectively destroyed the Muslim world's one chance at a philosophical and spiritual Renaissance.

It's rather sad to note that if the ethos of the time had encouraged skepticism, freethought and unfettered rational inquiry a bit more, we wouldn't be in the mess we're in right now. It took the other two Abrahamic religions half a millennium before they managed to start looking upon religion with a certain degree of irreverence, which is a very underrated achievement. Religion is fine, as long as it isn't taken too seriously, or interpreted literally.

I've always had issues with the more ridiculous aspects of religion - miracles, revelations, prophets, gods - the usual supernatural garbage that comes with all religions in some form or other. I guess it appeals to a basic flaw in human reasoning - and one which isn't all that benign. Tyson pointed out one sad effect of it when he showed several religious references in the work of Ptolemy, Newton and Huygens. Without fail, whenever they started praising something as the incomprehensible work of some creator or other, that's where they stopped discovering and creating. And then, a century later, someone without 'God on the brain' would come along, take over from where they left off, and God would vanish from that particular domain as well. Newton was able to figure out the motion of the planets, but he stopped at the question of why the system was stable. That is where he invoked God, and that was it. God wasn't anywhere in planetary motion itself - that much was understood. Wherever people understood things, God vanished from the picture.

It's almost terrifying how religiosity stopped the greatest genius in human history. It wasn't even an insoluble problem - Laplace came along and solved it by inventing perturbation theory, something that should have been child's play for Newton, in view of what he did.

Divine explanations aren't really explanations - they're just a sort of agreement to stop thinking, wrapped up in flowery language, self-congratulation and a sort of righteous humility that is thoroughly misplaced, and rooted in insufferable arrogance.

"Right, we've understood all this, but here's a question that no one can figure out - and no one will ever figure out. Yayyyy, God."

If that isn't arrogance, I don't know what is. And on top of that, it usually accompanies mythologies that imply humans and Earth are somehow central to the universe, in terms of purpose and importance. Yeesh.

Anyway, I'm going to stop ranting now. Richard Dawkins is coming up next.