Saturday, December 24, 2005

Rise and Snooze.

I recently installed 3D Studio Max 8 and wanted to test drive the new version of the Mental Ray renderer. It's supposed to produce better quality renders even at low sample settings (compared to the previous versions), and to do so faster. So I took my Bedroom scene, which I call "Rise and Snooze" just to explain away the young lady's laziness, and tried my rusty rendering skills on it.

For the renderer, I set the settings thus:
-GI: 500 Photons/sample, 100000 Photons/light, Max 2 Reflections, Max 2 Refractions;
-Final Gather: 50 Samples, Max 2 Reflections, Max 2 Refractions, Max 1 Bounce;
-Renderer: Mitchell 4x4, Min 1 Sample/pixel, Max 4 Samples/pixel;

(Max. Bounces is new to this version; it controls the maximum number of times a diffuse ray is bounced around the scene.)

The angle below took 40 minutes to render. I don't have figures from 3DS Max 7 to compare this with. Nor would the comparison be fair, given the relatively small number of samples required to achieve the same result in 3DS Max 8.



The lady is 'Jessi', a biped from Poser 6, made to lie in that pose. Making her was easy; importing her into Max was, to say the least, tough. And the bedspread is my first Reactor cloth. Other than these, the room's pretty basic. When I say basic, I mean that it is made of cuboids and cones. I am a simple man.


Sunday, December 11, 2005

Semper fi? Don't think so.

I had high hopes for Doom the movie when it was announced almost a year ago. Sure, the cast was only mediocre (except for The Rock, of course), and so far no video-game-to-movie adaptation had been effective in achieving its intention.
But this is *Doom*. How can anyone mess that up? So my expectations for this film were quite high. I had kept away from spoilers for this movie, and the only one I heard of was the short FPS sequence that was supposed to be included in the film. If anything, my anticipation only went higher with this news.

I finally got to watch the movie on DVD. Even just a few minutes in, I could tell that no one in the production department had played Doom or even knew its storyline. That, or they simply thought that the tried and tested plot of gene-mutation monsters was better than Doom's hell-portal theory. Sure, there were a couple of recognizable Doom creatures, like the zombies, the Imp, the Hell Knight aka the Baron, and everyone's favourite, Pinky, and the story takes place on Mars. But apart from that, it's definitely not Doom.

The main story in Doom, or at least Doom 3, is about an ancient civilization on Mars that researches and succeeds in inter-dimensional travel via teleportation. But they find that they have opened a portal to actual Hell, and Hell's demons start to invade Mars. The Martians create an artifact called the 'Soul Cube' through the sacrifice of almost all of the Martian population, and arm their strongest warrior with it. The legions from Hell are beaten back by the warrior. The remainder of the Martian population teleports to another dimension/galaxy, while a few others teleport over to Earth and become the first humans. The Martians leave behind a detailed documentary on stone tablets: their opening of the portal, the evil that followed, the usage of the Soul Cube, its location, and how they advise other civilizations not to try this in their backyard.

The Union AeroSpace Corporation, UAC, establishes a research facility on Mars, and contemporary scientists, upon finding the tablets, attempt to recreate the portal opening. They succeed, but they are not aware of the Hell concept. So scout teams and researchers, no matter what they are armed with, return either as dead bodies, as lobotomized zombies with extra-large bloodlust, or as gibbering wrecks. Dr. Betruger, the lead Evil scientist at the base, leads on the research no matter what the body count. This is when you enter the UAC facility. You were a Marine who disobeyed a direct order to gun down innocent civilians. Giving a fatal punch to your CO is not a very good way of disobeying; the CO gets a military burial while you get your can kicked over to Mars (on the theory that your body will naturally follow it). Dr. Betruger, meanwhile, has plans of his own: he goes through to Hell, gives the Soul Cube to the Devil there, and reaches a pact with them.
From here on, though the game becomes Schwarzenegger-style kill-'em-all, it's the attention to detail on the horror element that really makes it Doom. For example, I remember one time when the light flickered for a few seconds, and then there was an Imp just behind the iron staircase. But when the light came back on, there wasn't anything there or behind me. All this I noticed out of the corner of my eye. I still hadn't worked out whether that was actually scripted, my hallucination, or a bug in the game's clipping code. (Actually, I replayed the game and saw the Imp again. So it *was* a scripted event :) )
All along the way, you hear distant voices of your fellow marines getting acquainted with the demons, see things that aren't there, and witness giant bloody growths all over the base; humans, disfigured and dismembered, arranged in some satanic art; cries of babies and women pleading with you or beckoning you. There are just too many scare devices to count. And then there's the no-scotch-tape-on-Mars syndrome, which means you can't mount the infinite-battery-life flashlight on your weapon. A shame when you consider that power cuts are virtually permanent on the base. The creatures from Hell are a bit too demanding on the maintenance crew, I guess.

I didn't expect a Fraps-recorded version of the gameplay released as a movie. Mainly because it would be nauseating and boring without the interactivity of the game, but also because there already are such videos of Doom 3 and other games on the net. They are called speedruns, and unless you've played the game well, you can't follow the video or its intent. 100% runs, in which the gamer finishes the game as fast as possible while collecting all the collectibles and killing all the killable NPCs, are hard to come by, but they would give you some idea of what the Doom experience is all about. VisualWalkthroughs is a great site for those who want a low-bandwidth Doom experience. Still, I certainly expected the movie to exceed my expectations. You know, get creative and novel and stuff.
But no. There's the old gene-mutation theory in a cloak, the human genome containing the 'blueprint' of the soul (?), clichéd one-liner dialogues, despicable characters we wouldn't mind rotting in, well, Mars, a heroine (is she?) who can do autopsies on hellish monsters but annoys viewers with her constant screams upon viewing a zombie lobotomizing itself on the other side of bullet-proof glass, a civilization that can travel at the speed of light but still uses guns and metal bullets, ...

The movie in question is not bad at all, as a movie. It has a weak-but-OK storyline of the kind you would expect of an action movie, an OK cast (The Rock only sways rather than rocks in this movie), the occasional nod to FPS gamers (Dr. Carmack, the BFG, the FPS mode), and a couple of catchy one-liners ("Semper fi, mofo!", "Kill them all, let God sort them out."). I did root, root when the FPS mode started (about 3 minutes long), and I did watch the movie about four times. In fact, if they had named the movie something along the lines of 'Non-Resident Evil' or 'YARE (Yet Another Resident Evil)' or 'Resident Evil on Mars', then I am sure it wouldn't have received so many sighs from gamers and film critics.

Being a marine, Dwayne Johnson flaunts the US Marine Corps motto, "Semper Fi" (actually, "Semper fidelis"), all over the place, whether on his extra-wide shoulder tattoo or when talking down his adversary in the final boss fight ("Semper fi, mofo!"). The exact meaning of the phrase is "always faithful", but, as Sarge, The Rock translates it as "Faithful to the core". Well, I don't think this movie is faithful to the core at all. It just isn't Doom.

Thursday, December 01, 2005

WTF - Event 2

Well, it seems that my problem may have been caused by TuneUp Utilities 2006. It was probably messing up some registry value, which I found out the hard way. And since Windows is a tangle of inter-dependencies down to the level of the (badly designed) monolithic kernel, it became inexplicably crippled. But since I wasn't wise to TuneUp Utilities causing trouble, I installed the software again after Event 1 and restarted Windows. There was the same old problem staring bleakly at me. So I just popped in the XP boot CD and started the installation.

But hey, if the setup was uneventful, there'd be no point in this post! For no (obvious) reason, I had to restart the setup over and over because of a cryptic error dialog (more like a monologue):

Fatal error:
Setup failed to install the product catalogs.
The signature for Windows XP professional upgrade is invalid. The error code is 426. The service has not been started.


The Microsoft support site reckons that it's caused by a catalog folder left behind by the previous OS installation. The support article asked me to press SHIFT+F10 at the error message to get a cmd prompt window. From there I am supposed to cd to windows\System32 and rename 'Catroot2' to 'catold'. It's cool to know that you can access a cmd prompt from here. It isn't very useful, but at least it's more functional and workable than the Recovery Console.

But the thing is, the time between the start of the 'Installing Windows' phase (the SHIFT+F10 trick starts to work only in this phase) and the appearance of the error dialog is about 10 seconds. I raced against the clock to delete/rename the damn catroot folder, but it was a no-go situation.

Tired of doing the catroot thing, I ran bootcfg.exe and lowered the amount of RAM visible to Windows to only 128MB of the 1GB. Of course, I knew this was a long shot and wouldn't work, but some setup-related problems with older versions of Windows go away with this trick.

I tried to move some important files to my other partition. Next, I tried running the setup.exe on the Windows setup CD from within the cmd window. And it launched the setup! But, unfortunately, the "child setup" also stopped at the same spot as its "parent", with the same error message.
In the end I had to do a quick format and fresh install like I did in my previous post. I could've avoided this format by deleting just the Windows-related files on the partition, but it would've smelled like a boat-load of bilge water to find some file I forgot or something. This time around, I've steered clear of TuneUp Utilities. But how I miss it! Hopefully, this is the end of my 'formatathon'.

PS: On Google, as of this writing, my blog is the only hit for the word 'formatathon' :)


Friday, November 25, 2005

May you live in interesting times.

Nicolaus Copernicus showed that our home planet, the Earth, is not at the centre of the solar system, but rather a relatively small planet, lost among the ranks of a sundry group. Any hope for the concept of the 'speciality of the human race' (anthropocentrism) that remained among certain fanatical philosophers was put under constant stress as facts like our Earth not being at the centre of our galaxy, our galaxy not being at the centre of the Local Group of galaxies, and our galaxy cluster not being at the centre of the universe were discovered successively.

The Copernican theory, profound in itself, induced more spectacular scientific cornerstones to be laid. One of those is the prediction, discovery and subsequent mapping of the Cosmic Background Radiation. Another, not very well known, observation is by Richard Gott, who used the non-speciality of an observer's view of an event to predict the longevity of the event. Of course, since the theory is based on mathematical constructs and not on some vaticinatory knowledge, there's always a confidence level associated with the prediction. The prediction works only so long as the observer is observing the said event at a non-special time in its existence.

To understand this weird condition (non-speciality), an understanding of the Standard Normal Curve is in order. If you are thinking, "Math? Run like hell!", then don't bother running. I am only going to tell you what I know, which is more like English.

If you make a large number of observations which have some truly random noise, then the results tend to arrange themselves in a bell-shaped curve called the Normal Curve. The curve shows up no matter the range or the data being observed. The curve aligns itself over the data with its peak, or apex, right on top of the average of the observations. If you take a chunk of the curve about the central line, then the area of the chunk gives the confidence level, in percentage, that the next observation lies within this region.
So, for example, given a range from 5 to 15, with the average at 10 (and a standard deviation of 1), you can predict with approximately 0% confidence that the value is exactly 10, with 68% confidence that the value lies in the range of 9 to 11, with 95% confidence that it lies within 8 to 12, and so on. But the requirement is that the randomness involved be really random, and not due to special circumstances that can be accounted for.
You cannot, however, predict with 100% confidence that the value lies even inside the range of 5 to 15. This is because of the curve's property of never "touching down" at any value; it just goes on to infinity on either side. I guess that if it didn't, we would all be Cassandras (and her male counterparts).
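(For the curious, the 68% and 95% figures come from the error function. Here's a minimal Python sketch; the function name is mine, and I'm assuming a mean of 10 and a standard deviation of 1, as in the 5-to-15 example above.)

```python
import math

def within_k_sigma(k):
    # Probability that a normally distributed observation lands within
    # k standard deviations of the mean: erf(k / sqrt(2)).
    return math.erf(k / math.sqrt(2))

# Mean 10, standard deviation 1, as in the 5-to-15 example:
for k in (1, 2, 3):
    print(f"within {10 - k} to {10 + k}: {within_k_sigma(k):.1%}")
```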

Now that the primer has been laid down, here's the actual content. Richard Gott says that if you are an observer who 'just happened' to observe an event/entity at a non-special part of its existence (that is, neither on the day of its conception nor on the day of some event, like a war, that could destroy it), then your observation can be mapped onto a curve which stretches from the entity's creation to its (future) destruction. And, considering that your visit is non-special (for example, you weren't invited to witness its end), this situation bears no significant difference from the curve in the paragraph above. Which means we can apply the same prediction-with-confidence-level trick here too. The resulting prediction will tell you, with some confidence, at what percentage of the entity's lifetime you are observing it. From this, you can predict the entity's total lifetime, again with some confidence.

The title of this post, 'May you live in interesting times', is supposedly a Chinese curse. I don't know. But this post gives it a whole new evil dimension. Since you are required to be an observer at a non-special time of the entity to make a prediction about it, you can conversely say that as long as you can predict the longevity of something, the entity will remain relatively unchanged. Take the act of predicting the longevity of humanity. If you are eligible to predict, then you are observing it at a non-special, or non-interesting, time. But if you are living in interesting times, then you can't predict anything, because you belong to humanity and something big (possibly bad?) is going to happen to it. Give me boring times any day!

A go at Google:
So now, how about we predict something? Something like Google's existence? Google has been around since September 1998. I am making this observation in November 2005. Since there's nothing special about me observing Google today, I can say with 95% confidence that Google will last for more than 2.25 months but less than 281 years.
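(If you want to check the arithmetic, here's a minimal Python sketch of Gott's 95% interval; the function name is mine, and I'm taking roughly 86 months as Google's age in November 2005.)

```python
def gott_interval(past_duration, confidence=0.95):
    # Gott's delta-t argument: if you observe an entity at a random,
    # non-special moment of its lifetime, then with the given confidence
    # its *future* duration lies between past * (1 - c) / (1 + c)
    # and past * (1 + c) / (1 - c). For c = 0.95 those factors are
    # 1/39 and 39.
    c = confidence
    return (past_duration * (1 - c) / (1 + c),
            past_duration * (1 + c) / (1 - c))

low, high = gott_interval(86)  # months since September 1998
print(f"more than {low:.2f} months, less than {high / 12:.0f} years")
```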
You can predict the longevity of other companies, the human race, organizations, your relationship, or how long your college/university will last, if that will give you pleasure.

Monday, November 14, 2005

Dear Suse,

A week ago, I went to ConneXions, Annanagar, to buy some interesting books and other miscellany (including a cool mouse pad, whose review will follow). I noticed that the SUSE 10.0 OSS version DVD was bundled along with the November issue of Linux For You magazine. Since this was during my XP formatathon, I purchased it to try it out.
I installed it in some 8GB of unpartitioned space I had. I don't think there's any point in talking about the installation itself, as Linux installation, for the past couple of years, has been simple enough to go through (except for some weirdo distros aimed at sooper-nerds). Maybe it could've been worse if I had tried to install Linux on my RAID-0 array. In fact, SUSE told me that I have a software RAID, contrary to what my BIOS informs her! Whatever. Since I didn't want to install on the RAID partition, the install went uneventfully (if you don't count successfully installing Linux as an event, that is). For those who want to have a look at the install though, here's a flash movie :)
One thing I noticed at boot-up was that there's now a splash screen with a progress indicator. Newbies rejoice, as they would find a bunch of growing dots less intimidating than the pages of scrolling DOS text à la Hollywood, or the progress-bar-less screen that I was used to. Of course, this feature might've been introduced in earlier versions and I only noticed it now. But I was annoyed that some things stayed the same. I mean, even after so many years since I last checked out Linux, why should I still manually enter the monitor frequencies when I don't have to do so in any of the Windows OSes?

Yast2
Imagine having to know how to extract VCD frames from a VCD just to watch a movie. Many geeks reading this may be able to do so. But you are not the main market for the VCD industry. The average Joes, who just want things to work, are the ones, fortunately or unfortunately.
Following this analogy, I would say that Linux is a right pain when you want to install anything on it. With utilities like YaST, this has almost been eradicated. YaST2 was good. The apropos-style searching for programs and drivers is appreciable. The only thing you have to do manually is seek out installation sources: the paths to software packages on the Internet or on local disks. This step too could've been automated with something similar to GWebCache in Gnutella, so that we get an up-to-date list of servers hosting the essential packages, or at least the packages that come with the DVD.
Another thing with YaST was that, by default, the installation sources had one entry named 'cdrom' something. But it didn't search my DVD drive at all, even though that's where I installed from. Later, when I added "dvd:\" or something like that (after googling), I was able to search for some dependencies on the DVD drive. Also, it took minutes to refresh from all the installation sources (3 sites and the DVD). Maybe the sites were slow.
The first thing I did with YaST was to download the NVIDIA driver for SUSE. I would've had a tad more trouble finding drivers for my old ATi card (9700pro), but, thankfully, NVIDIA pays equal attention to the Linux version of its drivers. With or without the drivers, the way Linux renders everything onscreen has been, somehow, delightfully different. I tell you what: take the same webpage, view it in Linux and in Windows, and compare the quality for yourself. My friend Balakrishnan used to say the same thing about Linux when he tried to make me abandon Windows forever. Not just the visuals, the sound is different too. And again, it's delightfully so, as the volume goes higher and sounds more amplified without any jarring artifacts. The first time I ran Linux, I couldn't believe it was the same computer that ran Windows, and I've just rediscovered that feeling.

Multimedia keys and Mouse buttons
Other things that I've gotten quite used to in the Windows world, but couldn't figure out in the Linux "woerld", were my keyboard's multimedia keys and my mouse's forward & backward buttons [link]. There were tutorials for making the mouse buttons work by changing the mappings in X Window. I followed the example for a 6-button mouse, but it turned out that my mouse was a 7-button one. I haven't gotten round to trying the new mapping yet.

The same story with my keyboard's special keys, too. I tried quite a few pieces of advice from various forums, but all of them assumed a wireless multimedia keyboard and none of them worked on my wired one.

AmaroK
I love iTunes (you won't like it only if you haven't tried it). There's, of course, a Mac version too. But sadly, there's no iTunes for Linux, yet. What Apple means by that, I don't know. I searched around and found that CrossOver Office users can now enjoy iTunes on Linux. But I found a worthy temporary placeholder for iTunes, called Amarok. I could connect to my Last.fm account using its built-in plugin. In 'dynamic mode', it plays recommended music based on my song history. A kinda OK feature, but it tends to annoy you by repeating songs over and over. It's got an OK interface. With so many features spread over horizontal and vertical tabs, it can be complex and non-intuitive. For example, I can't figure out how to "uncheck" a song to keep it from playing. And its context menu could do with some more entries, like what iTunes has. But I loved the automatic lyrics display (based on some OSS plugin which lets you submit lyrics), the artist/album/title search (using some interface to Wikipedia), and the context functionality that displays the various contexts in which the current song appears. I would love to have some of these features in iTunes.
I couldn't find any codec installed by default that would play my MP3 or MOV (QuickTime) collection. So I had to install a Windows codec collection called w32codec-all, which really did contain almost all of the plugins necessary. My media collection is on a space-sparse 80GB hard drive in NTFS format. Amarok had a very good album art manager which, unfortunately, couldn't write to the NTFS media drive. The last time I checked, there was only support for over-writing existing files on NTFS. But maybe I would've found a fully supported NTFS write driver if I had searched for it in YaST.

FireFox
The onboard NVIDIA Ethernet card was detected correctly. But I had to look all over to set up a PPPoE connection; I finally found it under the ATM devices setup. A minor observation, and nothing more, about Firefox: double-clicking the address bar in the Windows version selects the text up to a delimiter, but in Linux it selects the entire text. It was a bit annoying, as I was used to doing this to quickly select and change the last value in the URL (like http://forum.com/?forum=32) to move among sub-forums.


At the end of the day, SUSE's YaST rules, and the new versions of GNOME and KDE look promising. Though I am sure I haven't enjoyed anything that is exclusive to SUSE 10, it will be the future patches, apps and derivative distros released by the OSS community that will make the impact of the OSS decision felt better. Already there is an OSS effort called SUPER (SUSE Performance Enhanced Release) to release a slimmer and nimbler SUSE. I only checked out the GNOME desktop. Many seem to favour the KDE environment, but I find GNOME less cluttered and simpler.
But I am still not using Linux as my main OS, because I don't want the hassle of making Visual Studio, my games and the other apps I use work in Linux. It's the general laziness prevalent among Windows users.


Sunday, November 13, 2005

Charles Darwin in Chennai?

Annanagar Talk, a local newspaper, reported that on Nov 14, the great-great-granddaughter of none other than Charles Darwin himself was going to do a book-reading session here in Chennai. And specifically in Annanagar, my neighbourhood! She was also supposed to have an interactive session after the event.

In all my excitement, I forgot to glance at the title of the book she was to read out. Well, I did glance at it later, and it was "Tigers in Red Weather".

Ruth Padel, as it turns out, disappointingly to me, doesn't talk much about evolution in her book or in her after-book-read interactive session. Rather, her book is supposed to be a travelogue of sorts, delineating the various places tigers live, focusing on the "physical, scientific and political significance of the tiger", and versed well for the 'spiritually kindled' and the 'poetic-eyed'. Old people, in other words.
Shame. I was hoping to get a glimpse of Charles Darwin's spirit in her eyes as she builds on her great ancestor's ideas. Maybe he still glints in her genes, but I'll pass on the chance to meet her if she's only going to ramble on about her books. I've got more significant tasks planned for the evening. Like going to a food fest, FoodPro 2005, at the Chennai Trade Centre, for example. I always wanted to see a chocolate fountain.

But anyway, it's great to hear that she will be around in my neighbourhood, and she'll always be welcome.

Thursday, November 10, 2005

Windows Themed Format-athon (WTF?)

When you get a BSOD immediately after login, you know you messed up some startup service or file. You also know that long and dreary nights of non-fun-filled activities, involving smuggling your precious data files away from Windows' reach, formatting and reformatting, lie ahead.

Of course, one should not ignore the fact that Windows normally does not act abnormally without due provocation. But the causes are sometimes so varied, and take such time to brew, that you just can't point at a single piece of software or file. I usually leave my favourite BitTorrent client, Azureus, downloading all night. Recently, due to the less-than-fit power lines (kudos to the torrents of rain), the torrents of files would stop because of abrupt power cuts at night. Couple this with a slightly under-rated UPS and WINDOWS, et voilà, you've got a BSOD.

I also suspect the FlyAKite OS skin that I recently installed. But I am not so sure of that, because I logged in via safe mode and tried disabling all services, uninstalling the said skinning software, moving the mouse to USB, disconnecting some USB devices, moving the mouse back to PS/2, and the "Last Known Good Configuration" option in the boot menu (though, as I understand it, that will only get you to the login screen), all to no avail.

Windows was beeping very strongly at me: "REINSTALL!"

So, reinstall I did.

I wanted to make this into a "Newbie Desktop Windows XP User's Guide to Reinstalling Windows Without Having to Rebuild Data" (NDWXPUGRWWHRD). So here goes:

First off, it was safe mode again, to back up the "Documents and Settings" folder and some Python source files.

Before I had this 320GB worth of HDD space, I used to allocate the bare minimum of space to the Windows partition. Usually less than 5GB. Windows XP has a special folder called "Documents and Settings" (D&S) to encourage applications to store all their data files, like save games, settings, caches, user profiles, etc. I used to rage over this idea. I mean, if I asked a program to install in one folder, why does it store its heavy data files in the Windows partition that is already bare of free space? But now, it makes sense, somewhat. When you reinstall the program files after reinstalling Windows, or when the partition that contains the 'runnables' gets deleted, you can still reinstall the programs and continue working (or playing, in the case of games) as if nothing had interfered. So the lesson of the day is that the "Documents and Settings" folder is good. Or is it really? There is a possibility that your Windows partition's hard drive may crash. In that case, _all_ your programs' settings are lost. This is the problem of centralization. But with HDD crashes being fairly rare these days, I don't see a significant threat here.

Unfortunately, it was not all smooth sailing for me. It was quite some time since I last partitioned my HDD, and this led to some partition confusion. First of all, I had an active partition on the second HDD, where I keep my Linux OS (which I hardly use). But the Windows partition lies on the first HDD, which I keep as the first boot drive. I wanted to format my old Windows partition and also award the active-partition status to C:\. For this I used DOS Partition Magic, since this was a RAID-0 partition and can't be recognized by a DOS boot disk. But when I formatted and started the install, my RAID array was in disarray: *Who formatted my D:?* (which was on the same drive as my Windows). D'oh! That'll teach me to do these things while nodding off at the screen.

Fearing more damage and possible incompatibility with the Win98/Linux installations, I left changing the active status for a later time. But now Windows wouldn't install. It kept saying:

"To install Windows XP on the partition you selected, setup must write some startup files to the disk:
<disk name>
However this disk does not contain a Windows XP compatible partition."


The obvious solution was to make the RAIDed partition active. But I had another idea: disable the 'active' hard disk and keep only the RAID HDD. It worked, and no more whining from XP that the partition is not XP compatible!

I must mention this little quirk with the ASUS RAID driver disk. A file called idecoi.dll (IDE Co-Installer) was not getting copied into the Windows folder during install. It seems this file is faulty on the ASUS motherboard CD! Since I've installed Windows on my system before (duh!), I knew that this problem exists and that idecoi.dll's absence doesn't worry any app. I even renamed a GIF as idecoi.dll and copied it into system32 as a place-keeper, and later replaced it with one found on the website.

There was a rather pleasant interlude with one Suse, looking captivating in brilliant green. But that is another story!

With Windows XP back in the box and the remaining HDDs re-enabled in the BIOS, it was time to reinstall the applications so they could register themselves in the registry. But the D:\, where I installed applications, was gone and formatted (all 139GB of it!). So I downloaded and installed some recovery tools. I had more than installed programs on this partition: 3D Studio Max files, video and movie files, some audio, some INI files, renders, etc. I tried "PC Inspector File Recovery" (free), "Zero Assumption Recovery" (paid), "Recover My Files" (paid) and, based on a tip-off, "File Scavenger 3.0" (paid). In the end, it was "File Scavenger" that worked the fastest and best among them. I would definitely recommend it for NTFS-based recoveries.

Apart from backing up the D&S folder, I also backed up some settings using the *Files and Settings Transfer Wizard*. It does more or less the same thing as backing up the D&S folder, but in an MS kind of way. So, soon after I recovered my D: partition, I reinstalled the programs and started this wizard to import back my old settings. But, alas, it goeth:

"The location you specified does not contain stored information" (!)

no matter whether I pointed it at the same folder, its immediate parent, or its child folder.

Even my favourite iTunes' recovery system was behaving badly. iTunes has the option of saving the entire music library (including ratings, play counts and other player-added meta details) to a special "song list.txt" file, to be restored later. But on importing this list, I got:

"The file cannot be imported. An unknown error occurred (-50)"

Reason: no idea! But since iTunes saved its config and song list in the D&S\My Music folder, I was able to recover it anyway. I just copied this folder wholesale over the one in Windows (a mistake, as I later realized), and we have normality!

There were no hiccups with the recovery; Firefox, LimeWire, Azureus, Gaim, Opera, Mercora, the Adobe suite, the Macromedia suite and others worked as they used to. Even all the plugins that I installed last time were there.

So the BIG lesson here is: try and back up the D&S folder at any cost. Take periodic backups of it to another HDD or a CD/DVD. Do this even if you are using an application-provided export feature. Also remember that not all applications store their settings in the D&S folder. Among other things to back up, the INI files in the directory where you installed an application are a good idea too. I did so with my 3D Studio Max, and now it has the same UI and other settings.
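(If you'd rather script the periodic backup than remember to do it, here's a minimal Python sketch; the function name is mine, and on a real XP box the source would be something like C:\Documents and Settings.)

```python
import os
import shutil
import time

def backup_settings(src, dest_root):
    # Copy a settings folder (e.g. "Documents and Settings") into a
    # timestamped subfolder of dest_root, so repeated runs keep the
    # older backups around instead of overwriting them.
    stamp = time.strftime("%Y%m%d-%H%M%S")
    dest = os.path.join(dest_root, "backup-" + stamp)
    shutil.copytree(src, dest)
    return dest
```

Point it at another HDD (or a folder you later burn to CD/DVD) and run it from the Task Scheduler every so often.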
(As of my writing, I had already done another reformat and reinstall. Details in next blog post.)


Wednesday, November 02, 2005

The input animal.

My usually white Microsoft Internet Keyboard was looking a bit dark, toned down by layers of dust and oil (courtesy of my hyperactive sebaceous glands). Since I'd been looking forward to ridding my keyboard of 'keyboard dandruff' for some time anyway, I got a screwdriver, some soap water in a mug and some raggedy old cloths for the clean-up. After painstakingly removing all the keys, I washed them individually in the soap water, put them out to dry in the shade and then wiped each key separately with a clean cloth. Try doing it sometime and you will be surprised how much pain can accumulate in your fingers in one day!

I gave my 'input animal', aka my Microsoft Wheel Mouse Optical, a quick sponge bath too. But, alas, my keyboard had drowned in the sudden deluge of my loving care, while my corded pet survived. The keyboard had lost all feeling under the space bar and the up arrow key. I tried to soldier on with these handicaps for some time. But, as is usual in a Murphy's-law-abiding universe, there suddenly seemed to be too many things that required precisely the keys that had gone kaput. So, interring the fallen keyboard in its original box, rather unceremoniously, I embarked on a journey to buy a new and better keyboard.

My mouse, though it survived my rigorous clean-up unscathed, was never in awe-inspiring shape. The scroll wheel, feeling it hadn't been scratched for some time, would scroll down one notch all by itself. It's especially irritating when you are playing Quake III Arena and find yourself holding a gauntlet when you were sure you had the rocket launcher just a moment ago. Or when, trying to follow a confusing IEEE paper, you suddenly have no clue what the sentence you were reading means (quite understandable?), only to realise that the mouse had taken the liberty of scrolling down a few lines. So I planned on making it a double purchase: keyboard and mouse.


The mouse hunt:
I always wanted one of those mice with plenty of buttons on the sides. They looked cool. And I could do with a higher-resolution mouse as well, so I can cover more screen real estate with as little wrist movement, and hence fatigue, as possible. For the keyboard, I wanted one with more accessible shortcut keys than the current one. My desk (as in wooden furniture) has a sliding tray for the keyboard and I don't always extend it fully, so the top-row shortcut keys were seldom used. I also wanted the keyboard to be comfortable for 24/7 coding and gaming, so a palm rest was compulsory. Initially I wanted a white keyboard, as I am quite a nocturnal (read: 24/7) computer user. But the model I came to like (more for its cost-effectiveness than its drool factor) didn't look quite 'neat' in white, so I embraced the black one instead.

Next, there's the issue of corded and cordless 'mouse devices' (courtesy of Microsoft support; I didn't know if it was mice or mouses). I've never tried a cordless mouse, and I didn't know if one would be practical, as in:
- battery life (how long before I have to recharge),
- range (I hear the Bluetooth ones can have a large enough range),
- orientation (you know, without the tail that I was used to),
- latency (I've heard people in forums talking about a noticeable lag in high-speed FPS games),
- etc. (?)

So, after much online and offline window shopping, I bought an iBall Laser mouse. Like the one below:



Pretty neat, IMHO. And my HOs are freakin' damn good :p

Being a laser mouse, it works on the principle of laser interferometry. From what I can piece together, a laser mouse splits a single laser beam, bouncing one half off the mouse-pad surface and the other off a fixed reference reflector. When the beams are recombined, they form an interference pattern from the constructive and destructive interference of the light waves. This pattern shifts as the path length of one beam changes; no prizes for guessing that it's the beam that went out of the mouse. Depending on the laser's wavelength and the quality of the sensor, it seems you can detect movements as small as a few nanometres! Imagine the cursor moving across the screen just because a clump of atoms moved under the mouse! (Yeah, I am exaggerating.) In any case, the interferometer used in our 'mouse devices' is a dirt-cheap one [link], so no worries.
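If you want to sanity-check those nanometre claims, the maths is tiny. A sketch with my own assumed numbers (the 850 nm wavelength is a guess at a typical laser diode, not an iBall spec):

```python
def displacement_from_fringes(fringe_count, wavelength_nm):
    """In a Michelson-style interferometer, the pattern repeats each time
    the measuring beam's *round-trip* path changes by one wavelength,
    i.e. every half-wavelength of actual surface movement."""
    return fringe_count * wavelength_nm / 2.0

# With an assumed 850 nm laser, one counted fringe corresponds to a
# movement of just 425 nm -- well below anything an eye could see.
```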

According to the iballonline website, this mouse features:
  • a 1600 cpi (counts per inch) laser sensor which, I am told, is 30 times better than my old optical mouse. This helps a lot if you have a large screen resolution and frequently run out of mouse-pad space moving the cursor across the screen.
  • a 6700 frames/sec scan rate, which gives you great control over the cursor's position. This is especially important in intense online FPS sessions, where your mouse's responsiveness can determine whether you or your enemy gets the headshot.
  • up to 20 G of recognized acceleration. This too aids in moving the cursor exactly where you want.
  • resolution so high, thanks to laser interferometry, that the mouse can see bumps where you cannot. This means you can use even an ordinary opaque mirror or glass sheet as a mouse pad!
  • 6 buttons: 3 normal buttons, 2 buttons on the side for 'forward' and 'backward' navigation (in browsers, iTunes, etc.) and one MS Office shortcuts button. This last button displays a circle of shortcuts for a fixed set of tasks. I am not so sure OO.org users will be pleased though, as I was not able to change the list of Office icons displayed. When you click this 6th button inside one of the supported MS Office apps, you get a different set of shortcuts. There is a utility provided for changing the shortcut list though. Here are a couple of screenshots of the shortcut lists; the first is the generic one and the next is what you will see inside MS Word.
The generic shortcut list
Inside MS Word
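To put that 1600 cpi figure in perspective, here's a back-of-the-envelope sketch (the 400 cpi figure for my old optical mouse is my assumption):

```python
def inches_to_cross_screen(screen_px, mouse_cpi):
    """How far the mouse must physically travel, in inches, to move the
    cursor across the whole screen at 1:1 sensitivity (no acceleration)."""
    return screen_px / float(mouse_cpi)

# On a 1600-pixel-wide desktop, this mouse needs just one inch of travel,
# versus four inches for an assumed 400 cpi optical mouse.
```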




The mouse, being as radical as it is, needs some getting used to at first:
The first thing to get over is the fact that your mouse is now a *very* twitchy little creature, over-enthusiastic to go wherever you want it to. It might make you think of your previous mouse as a stubborn mule. But all that lightning response can get to you, as you need to allow some slack for the waver of your 'positioning muscles'. So in desktop mode you may be better off scaling down the sensitivity a bit, else you will find yourself overshooting your target. You can max the sensitivity again before starting a game.
The second issue is a bit more of a complaint. Even with a very high-resolution mouse like this, you will often find that you need to "lift and drop" the mouse back to the center of the mouse pad. But this mouse's laser range is a bit too long, which effectively increases the "lift height" during such operations. The manufacturer could amend this; I know because the Logitech laser mice used to suffer from this very same issue, and now they no longer do.
The mouse cost me Rs.725 after a quick haggle from Rs.780. But a forum friend told me that his friend had bought it for Rs.600 :( I am no good at bargainig.....barganing....whatever.

Keying the (i)tunes:
Coming to my new keyboard, there isn't much to say. Not that it isn't good; I love my new keyboard, but it isn't exactly new technology. Here's my keyboard's profile. And here's a photo of it which I was able to google up:
I really like the multimedia buttons, as I always run iTunes in the background. It used to be a chore, for example, to skip to the next song if I wasn't in the mood for the current one, or to pause the music when answering the phone or the door. The volume control is also a godsend, as reaching my speaker unit usually involves a bit of a stretch. Now I can minimize iTunes to the system tray, be working in Visual Studio or 3D Studio Max, and still control iTunes as if it were in the foreground! It seems impossible that I lived without these facilities for so long. I have to mention, though, that the keyboard sometimes seems unable to find the iTunes window to send its commands to. But once I bring the iTunes window to the front, the issue disappears.

There are a variety of other shortcut keys, as you can see, and they are totally configurable. Absolute control indeed. The function keys are actually 'dual-function' keys (heh). By default, the function keys behave as shortcuts for various word processing and email related tasks like undo, redo, new, open, close, reply, forward, send, spell, save and print. To turn this upside down world the right side up, press the F-Lock toggle key and you get back your old Function key functionality. Interestingly, the Scroll-Lock LED is missing. In its place is an LED to indicate to you the state of the F-Lock key. Who cares? I can truthfully say that I never used scroll-lock, except to check if the LED glowed, perhaps.


My next keyboard?
Assuming this keyboard lasts for more than a year, I would like to buy the new Microsoft keyboard with the built-in zoom slider and other cool features. But I would not say no to this drool-inducing gadget, the Optimus, either. Take a look:

Drab? I see you underestimate the Optimus. You see, the Optimus has an OLED key surface. That means not only can you change a key's function via software, you can actually change the key's display (color, font, image, animated icon) too! Now it's no longer drab, is it? And you've got to love that extra block of colorful keys.

Here's the optimus in Quake 3 mode :p~
The optimus in Quake 3 setting.
Here's a look at the colorful block:

The colorful, extra set of keys.




For the money-minded:
Microsoft Multimedia keyboard = Rs.1050
iBall Laser Precise mouse = Rs.725

Monday, October 31, 2005

Why ladies need us men? ;)

The idea of evolution as a more logical answer to how we came to be was very interesting to me; so interesting that it acted as a catalyst for my conversion to atheism. But there were more than a couple of questions from the antagonists of evolution for which I wasn't able to find answers. One of the most interesting, and the one that stumped me the longest, was this:

"Why the male, female schism in a species?"

After watching the fantastic PBS educational series "Evolution" (of particular interest is the episode "Why Sex?"), followed by a little research via Google, I was just too excited not to write this blog post. It seems that Darwin's theory of evolution answers the question of 'dimorphism' in a species. To understand the question, and hence the answer, we need to know more about reproduction. Reproduction is merely a side-effect of the process of mating, and it can be brought about in different manners. We are, of course, familiar with the sexual process. But the process we are interested in is the bizarre and fascinating subject of 'asexual reproduction' [link].

Asexual reproduction involves the genesis of progeny from the cells of just one parent. Although this method of reproduction is common among plants, it is not quite popular among animals, and less so among mammals. It's nature's way of cloning. Most plants can be propagated asexually by grafting, or by replanting a cutting of stem or root. The asexual animals I am familiar with are Mexican pond guppies and Mexican whiptail lizards. These are bizarre creatures which propagate by parthenogenesis, a method of cloning where the female produces an egg that matures without fertilization. These lizards all look alike because, well, they are all clones.
So why, then, didn't humans and other animals follow this particular path to their lineage's immortality? No males running after the females and trying to impress them with lame antics. Girls reading this may already be wondering: why-oh-why didn't it turn out like that? There are more reasons why females might want to wish away males. The sexual process of reproduction can be more trouble than you might recognize:
  • Engaging in mating rituals is tiring for the female. She has to choose a male with characteristics worthy enough to let him (and hence the offspring) survive and, of course, there is the actual act of mating.
  • Parthenogenesis (virgin birth), on the other hand, is very much possible. There wouldn't even be those crazy, violent males running around, endangering your child's life.

Not only the females; come to think of it, the males have it hard too:
  • A male in search of a female must grow ridiculously large appendages or bright plumage, or learn to build beautiful nests or sing loudly. All this activity and/or growth increases the male's risk of being spotted by predators.
  • Competition for a small number of females can lead to embarrassing defeats, debilitating wounds or, sometimes, fatal ones.

Even as a species, they stand to gain more by following parthenogenesis. A single parthenogenetic female can colonize a new area, whereas otherwise at least one male and one female of the same species are required.


So, what is the big prize that awaits those species that choose to endure all the pains of being a dimorphic one?

New possibilities. Adaptation to changes in the environment, resistance to new strains of parasites, escape from newly introduced predators and the ability to exploit new varieties of food resources are the prize. The 'dimorphic' model of a species assures the continuation of its lineage with the help of "genetic recombination", the shuffling of the genes, with random mutations giving the "gene-holder" an aegis against new threats. Each recombination carries forward the previous genetic "mistakes" and adds a fresh reshuffling "mistake" of its own: mutation, survival's coveted secret behind its grand experiment of evolution.


These are just a few of the observations that my computer science mind could comprehend. I might even have got some of these things wrong. But hopefully not.
Here's a fantastic link to a book extract that I 'tried' to read through: Male, Female: The Evolution of Human Sex Differences

Wednesday, October 26, 2005

Overclockers Anonymous

"Hi, I am Anand kumar. I am an overclocker."

With the recent purchase of a Winchester-core AMD64 3200+, I was impatient to learn the patient art of overclocking a PC to its limit. After desperate brute-force attacks on the various clock-related options via a utility, I was able to push the clock speed past the default to 2200MHz. This I achieved by just increasing the core clock from 200 to 220MHz (with the x10 multiplier). Other combinations didn't work which, as I later found out, was because the in-Windows overclocking utility (Asus AI Booster) kept crashing Windows at anything above 220MHz.

But later, after I got over my initial laziness and went through a few overclocking guides on a few forums, I finally achieved a much more respectable OC. Here's a screenshot I took just after a 2-million-digit SuperPI run. For some reason the SuperPI font is illegible, but the message box says it went all right.




Before I plough on, here are the basics of OC'ing for the uninitiated:

- First, you need a motherboard and processor that allow you to OC. For example, among processors, an AMD Venice core OCs better than a Winchester core; among motherboards, an Asus A8NE OCs better than most, while a DFI LanParty motherboard is the holy grail of overclockers.

- Next, as I found out, get RAM modules that have tight (low) timings and, if possible, heat spreaders. For example, my cheapo Hynix RAM running 3-8-4-4 @ 200MHz can't take much in the way of FSB clock, whereas a Corsair/OCZ running 2-5-2-2 has a lot of leeway.
- A clean and stable power supply unit also helps. An SMPS with little in the way of fluctuations will help keep the processor stable at marginal voltages.

- A good, well-ventilated cabinet helps keep the cool in your processor. As you increase the clock speed of a component, it requires extra power for the signal to have the same impact it had at the lower frequency. But cramming in more power means more stray power getting converted into unwanted heat. Most processors will operate without inexplicable auto-restarts and bewildering BSODs below 65 degrees C. The cooler you can keep it, the better.
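That heat build-up follows the textbook CMOS rule of thumb that dynamic power scales with V² × f. A toy calculation, using rule-of-thumb scaling and made-up example voltages, not measured figures:

```python
def power_scale(f_old, f_new, v_old, v_new):
    """Rough CMOS dynamic-power rule: P is proportional to V^2 * f.
    Returns the factor by which heat output grows after an overclock."""
    return (v_new / v_old) ** 2 * (f_new / f_old)

# e.g. going 2000 -> 2500 MHz at an unchanged 1.4 V is a 1.25x heat
# increase; bumping VCore from 1.4 to 1.5 V as well makes it ~1.43x.
```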

After you have taken care of the preparatory phase, just increase the HTT/core clock in steps of 5MHz and keep going till you hit a spot where the computer plain refuses to POST. Now, if your board is overclocker-friendly, it will let you access the BIOS settings no matter what, from where you can back off by 10MHz. Otherwise, you will have to open up your cabinet and reset the CMOS manually.
Next, increase the VCore (CPU voltage) one notch and try the HTT increase again. Anyway, 5MHz below unstable is the rule. Now do the same with your RAM by increasing the FSB in 5MHz steps, raising VDimm (RAM voltage) if it doesn't POST. But for Pete's sake (more like, for your computer's sake) keep an eye on temperatures. Be extra careful with RAM voltages, since any rise in RAM temperature goes unnoticed. This is where heat spreaders on the modules can help, somewhat.
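The step-and-back-off routine above is mechanical enough to write down as a loop. A sketch with a stand-in `posts_ok` stability check (my own invention; in real life that check is you, watching for failed POSTs and BSODs):

```python
def find_stable_clock(base_mhz, limit_mhz, posts_ok, step=5, margin=5):
    """Step the clock up by `step` MHz until the stability check fails,
    then back off by `margin` MHz for headroom -- roughly the
    '5 MHz below unstable' rule from the post."""
    clock = base_mhz
    while clock + step <= limit_mhz and posts_ok(clock + step):
        clock += step
    # If we never got past the base clock, there's nothing to back off from.
    return clock - margin if clock > base_mhz else clock

# Example: with a fake stability test that fails above 2505 MHz,
# find_stable_clock(2000, 3000, lambda mhz: mhz <= 2505) returns 2500.
```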

You can apply the above technique to your graphics accelerator as well. But 3D cards are especially notorious because they run two clock modes: a 2D mode and a 3D mode. So when you OC in 2D mode the temps may look cool, but when you start a game your GPU may BSOD or restart. The temperature problem is elusive, too: if you wanted to check the GPU temp while running a game, by the time you Alt+Tabbed to the desktop to check the temp proggy's display, the temperature would have dropped by 10 to 20 deg C! They cool down extremely quickly. Most high-end 3D cards operate above 90 deg C and may reach 105 on a hot day, so be careful when you OC them. Testing each 5MHz step with a benchmarking utility like 3DMark, checking for artifacts like snow, missing textures or anything out of the ordinary, will help keep the cool in your card.
My NVIDIA GeForce 6600GT (Leadtek) could only do 550MHz core and 1100MHz memory. It could do a bit more, but I kept these settings because the core and memory frequencies stay in sync this way.

Now you can either rest on your laurels, or proceed further. If you do want to proceed, you could try fine-tuning the CPU and memory clocks to achieve a perfect balance. You might also consider fitting your components with water blocks to reduce noise and temperatures. The rule for a fast computer: higher CPU clocks matter more than memory clocks.

I must have restarted more times that day than on all other days combined. But I can't seem to get any faster than 2500MHz; I had pushed all the timings and frequencies to the max, but the computer still won't boot at 2550MHz.
When I increase my system's memory capacity, I will be sure to buy a higher-end brand with better timings, now that I've tasted OCing. I've heard of Asus A8NEs having problems with 4 sticks of memory, so I will have to sell the existing sticks anyway and buy 2 sticks of 1GB each.

PS: For those interested in monetary details: (As of 01 June 2005)
My AMD 3200+ => Rs.9,800
Asus A8NE => Rs.7,600
Hynix 512MB => Rs.1,900
GForce 6600GT => Rs.10,000


(2ndSept05, 9:39PM)

Sunday, October 09, 2005

Of Meh, Meep and Bleh.

I am a great fan of bash.org, and lately I've been coming across a few foreign-looking exclamations that seem to be some kind of ultra-cliché among IRCers and in the blogosphere. So I set out to learn about them, but had difficulty googling them since every occurrence of these words had its author using them casually in conversation. So here's a small list, in no particular order, with possible meanings and a usage example each. I referred to the wiki and this and this site.


Meh:
A reply expressing a lack of enthusiasm. I first heard it said by Lisa in The Simpsons. Then, everywhere I looked, there was this word! Here's a discussion on the word's origin.
Usage:
- "Hey kids, do you want to go to the old age home?".
- "Meh."

Gah:
An exclamation expressing dismay or misfortune. Sounds like it comes from the word 'bah'.
Usage:
- "Gah! Curse this lag!"

Blargh:
An exclamation expressing uber anger, like a battle cry. Probably formed by melding 'blah' and 'aaargh'. I think it describes a 'sputtering rage'.
Usage:
- "You are a st00pid n00b!".
- "Blargh! I am going to h4x0r you to death!".

Meep:
An exclamation of fright, exaltation, anger or just about anything you feel at the time of usage :) But mostly uttered when someone is nastily surprised. Sounds as though it was derived from the meek cry of a lamb.
Usage:
- "Muhahaha, all cower at my uber sniping skillz!".
- "Meep!".

Bleh:
An exclamation used to express utter contempt, along with blatant boredom at doing something.
Usage:
- "Bleh! This work sux0rs!

Bah:
An exclamation of surprise and, especially, disgust. I've also seen it used as an expression of dismissal or rejection of an idea.
Usage:
- "They all so strongly believe that I will fail. Bah! I'll show them!".

D'oh!:
An exclamation said after realizing that you have said or done something stupid. It was invented by the makers of The Simpsons, who cut the syllable short to make Homer say it quicker. Such was the popularity of the show that almost all dictionaries have included it as a word meaning "annoyed grunt".
Usage:
- "Hey biatch, you've dropped your purse here!".
- "D'oh!".

Boob:
A dumb person. Someone with no brains who does nothing, just like the mammary sense of the word.
Usage:
- "Boob! You almost fubared my plan!"

Angst:
A feeling of self-pity, anxiety and of being misunderstood by others. Mostly refers to teens wallowing in depression and complaining rather than taking any action.
Usage:
- "Gah, LiveJournal is overflowing with angst filled teenz."

FUBAR:
Also written as FooBar, it stands for Fu*ked/Fouled Up Beyond All Recognition/Repair. Originally a US military term used to describe a mission's status quickly, its "foo + bar" version has been used extensively in the computer field as placeholder names for functions and variables. This is perhaps the only nonsensical word to have become a metasyntactic variable and eventually to get its own RFC (RFC 3092)!
Usage:
- "Heard about that accident? His car and body are fubar!"

Boo-Yah:
An exclamation of whooping when victorious in an argument or an event. I think it is a mutated version of "Oh yeah". Many spelling variations exist for this word: Boo-yea, Boo-ya, Boo-yeah and the like.
Usage:
- "Did u see me score that one? Boo-yah!"

Ping:
A quantum of happiness (?). People who are very 'Ping' are 'Pingful' and can exude ping onto others, intentionally or unintentionally. It's the opposite of Blargh.
Usage:
- "Ping! I am so happy that I can die right now!".

W00t:
An exclamation of whooping joy. It is supposed to have originated with online RPG addicts (Dungeons and Dragons types) who would type it in chat on receiving loot, instead of "wow, loot!".
Usage:
- "W00t! I won!".

pwned:
An exclamation used on someone you just beat in some competition of words or action, but especially in an online multiplayer game. It is basically 'owned' with a deliberate typo. Its origin is definitely in gaming, but whether it came from Warcraft or Quake is unknown.
Usage:
- "pwned!!! Haha, what a n00b!"

Thursday, September 08, 2005

Roaming memory

It somehow, and "somewhen", struck me to use the Internet as a storage medium when I first heard of the hop-by-hop relaying of Internet packets. I suppose that I supposed I would just send my file into the Internet and make it hop around continuously between routers until such time as I needed the data, at which point I would just sorta "call 'em home". It seemed simple and great. Later, when I learnt of the TTL (Time To Live) field in IP packets, my dreamed-up Internet Memory of sorts seemed more plausible. I thought that if I could somehow fool the routers into accepting a large TTL value, I could make a packet hop around for eternity (well, it didn't strike me to ask why the Internet hadn't ground to a halt with too many packets doing infinite rounds). And when I wanted my packet, I would *somehow* make one of the routers in the loop forward it to me on its next hop.

Reminiscing on this old thought, I now feel the idea could work by using your own routers in the loop, IP source routing and a lot of hackish coding. No need for a TTL hack. But I don't know yet whether it would work. For example, the protocols may not allow a looped source route, or a router may not respect source routing (the RFC allows for this behaviour). You can avoid loops in the route by maintaining two routers, each routing data to the other, so the route itself shows no loops. I haven't tried any of this, of course. I don't have two routers, for a start. And then there's the fact that I am not a Hacker. But I would be happy to know if someone has already done it or is planning to.

Moving on to the reasons this "Memory" might work, I have only one word for you: DRAM. That's right, Dynamic RAM's working principle is the closest analogy here. As you may know, a DRAM memory location forgets the data it holds unless internal circuitry "reminds" it of the data periodically. Forget to refresh it, and you've lost what was stored there. The important thing to notice is that you don't need a second memory stick of the same capacity from which to refresh the DRAM. Also, the chip's forgetfulness does not make us shun DRAM as the main memory in our computers. The reason we even consider it is that it can remember what it is told for some quantum of time.
Comparing that memory to the Internet, I hope you can see how easy it was to arrive at my conclusion that data can be stored by looping it around a known route. Internet delay/latency, though a dark gray pestilent cloud, has the silver lining of allowing data to be temporarily "lost" and then recovered, thus replicating a temporary storage.
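This refresh-and-recirculate idea can even be simulated in a few lines. A toy model only (the class and its naming are my own invention), where a value is "remembered" purely because it keeps circulating through a fixed number of hops:

```python
from collections import deque

class RoamingMemory:
    """Toy delay-line store: a value survives only because it keeps
    looping through a pipeline of `hops` stages, like a packet on a
    source-routed loop (or a DRAM cell between refreshes)."""
    def __init__(self, hops):
        self.loop = deque([None] * hops)  # one slot per router in the loop

    def send(self, packet):
        self.loop[0] = packet  # inject the packet at the first hop

    def tick(self):
        self.loop.rotate(1)    # every hop forwards to the next one

    def recall(self):
        # Pull whatever packet is currently passing the first hop;
        # returns None if the packet is still out on the loop.
        packet, self.loop[0] = self.loop[0], None
        return packet
```

The packet is only "home" once per full trip around the loop, which is exactly the recall-timing headache the real scheme would have.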

I am no great shakes with numbers, but if you take into consideration the following:
- the input and output queue lengths and sizes of each box along the route,
- the processing delay of each box,
- the path length,
- the path delay,
- the MTU of the path,
and couple them with your own bandwidth, you can pretty much guess exactly how much you can store in this "Roaming Memory" (heh, had to mention it *somewhere*) of yours.
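Putting rough numbers on that: the ceiling for such a store is essentially the bandwidth-delay product of the loop. A sketch with made-up figures for a 2005-era connection:

```python
def roaming_capacity_bytes(bandwidth_bps, rtt_seconds, overhead_fraction=0.0):
    """Maximum data 'in flight' on a loop: bandwidth x round-trip delay,
    minus a fraction lost to protocol overhead (TCP/IP headers, etc.)."""
    return bandwidth_bps / 8.0 * rtt_seconds * (1.0 - overhead_fraction)

# Assumed example: a 256 kbit/s line with a 300 ms loop delay can hold
# at most 256000/8 * 0.3 = 9600 bytes -- cache-sized, as argued below.
```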

You could also juggle many data packets at the same time, each packet travelling down its own source-routed path. The number of concurrent packets your connection can handle can easily be calculated from the packet sizes and the overhead of TCP/IP, ATM (if any) and anything else. Of course, to get a larger memory you can send different packets through each memory route, or, for more robustness against packet loss(!) along a path, you can keep a duplicate memory path for each packet.

Now to think of how you would actually call a packet home. First off, you can put an ID code in each packet so we can ask our router to pluck the right packet out of the loop. Here comes the hackish application I was talking about. This app will have to coordinate the packet routes and the routers' routing policy, set a new packet of data in loopy motion, ask a router to retract a packet from the loop, put the different IDs on the packets, maintain an ID-to-contents table, and everything else I've missed.

If you are perplexed as to why I resort to such a twisted way of storing data, and why I didn't think of a GDrive-like tactic of utilizing email space, you do have a point. But since email follows the store-and-forward method of sending messages, it has a large delay. What my idea provides is a very low-latency, high-speed but very small memory; more like the cache in your processor/motherboard. A GDrive-like idea gives you a very high latency but a large memory, kinda like the main memory/hard disk in your computer.

Friday, August 12, 2005

The Spiro collection - Making of



It's in no way a perfect render, but this is the 3D scene I was referring to in my Zone post. I modeled the set myself, referring to this real-world collection here. The collection consists of a Spiro bar stool, a Bistro (French for a small restaurant) table and a Spiro tea cart. Found it after googling for table-set images and loved the simple elegance.

I posted a small tutorial on this in a small forum because, simple though it is, the table seemed baffling to one of the members. So I am posting the tutorial here too, in case someone reading this needs a pointer on how I made it. Here goes.

At first attempt, I struggled a lot not knowing how to create the table. Here's my preliminary attempt:

This picture, which is grossly wrong, was pretty difficult for me and was made as shown in this screenshot:
I first created "guide circles" (those scaled-up circles in the side views). Then, in the front view, I drew a line with as many vertices as there are guide circles, running from the bottom circle to the top one. Then, in the top view, I moved the vertices of the line so that it conformed to each circle at a different angle. It's kinda difficult to explain, but I guess you will understand from this. The method was so tedious and error-prone that I struggled with it for more than an hour.

The later, more faithful version of the table was made this way:

I just created the four inner, curved legs that you can see, to set the scale of the table. Then I drew a Helix in the top view and adjusted its height in the side view so that it forms a good shell over the four rods (for the helix parameters, see the above screenie). Then I edited the bottom of the helix to make it look more like the one in the reference photo. Since the reference table has 4 legs with two helices per leg, I rotated and cloned the helix by 45 degrees seven times.
This method took me only 5 minutes.
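For the curious, the helix itself is just a parametric curve, so its shape can be sketched outside Max too. The radius/height/turns below are placeholders, not the parameters from my screenshot:

```python
import math

def helix_points(radius, height, turns, samples=100):
    """Points along a helix: a circle in the top view whose z coordinate
    climbs linearly -- the same shape Max's Helix primitive draws."""
    pts = []
    for i in range(samples + 1):
        t = i / float(samples)            # 0..1 along the curve
        angle = 2 * math.pi * turns * t
        pts.append((radius * math.cos(angle),
                    radius * math.sin(angle),
                    height * t))
    return pts

# Cloning this rotated 45 degrees seven times (eight copies in all)
# gives the two-helices-per-leg cage from the reference table.
```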


Actually, the entire scene is made from lines (well, splines) that were given a renderable thickness value. It would've been difficult to do the same with 3D solids.

For the material, I used the default Brushed Metal. The rendering environment was set to uffizi_probe.hdr from debevec.org. If you have ever tried to learn to light a scene, you will know that lighting is tough. So I didn't bother much with it; I just used one sunlight with Final Gather set to 700. This means you don't get sharp caustics or control over shadows, but this method has, as I found out, the best mix of realism and speed.

For the cloth, I just used the default "carpttan.jpg" as diffuse and bump maps in an Architectural material with the template set to 'Fabric'. It's my default way of creating cloth. For the cushion, I used a chamfered box with MeshSmooth and a lot of fillet segments.

Hope it helped someone. (I know this is expecting too much, as I don't have much readership, but I've got to keep the hope alive.)

In da Zone.

I always lock myself in my room with my computer and very rarely go to that big room called "Outside"; you know, the one with the very bright yellow-white bulb and lots of concrete structures. (In fact, I must be the only kid in my circle of friends to have had the honour of his parents hopelessly asking him to "go roam the streets".) Though I've been going to a local gym for some time now, I usually release my pent-up energy by jumping, walking or running in place in front of the mirror or, if I am extremely annoyed/exhilarated, punching my bedroom walls with my knuckles a couple-a times.

The other day, I had just finished creating a 3D scene which I'd thought would inevitably join the list of unfinished 3D scenes. So I was euphoric, especially because the render turned out not-so-ugly too. I was punching the wall a few times while listening to iTunes when I remembered the new gloves handed out at the gym to help with lifting barbells. I donned one on my right hand and punched once with my usual reserved force. This is when I made the mistake of thinking the padding might protect me no matter what the force. So there I was, punching the wall with all my might, a few times over.

Was there pain then? Nope. None at all. This encouraged me to sock the wall a few more times. Needless to say, I still have terrible pain in my knuckles though it's been a few days now. I don't know if they've cracked or not, but the skin there has darkened a bit.

But wait: why didn't I feel anything? Why did the pain wait some time to be felt? What was going on in my mind?

Here's another such experience from my life. When I was in sixth or seventh standard (I really don't remember which), I joined a local karate class. I attended for about a year, I guess, and obtained an orange belt. Shortly before my master decided to start a new class somewhere else, he had us participate in a karate tournament. It was an awesome experience, needless to say. But I was thoroughly distraught at the possibility of facing a well-placed roundhouse kick from some of the l33t black belts there. I remember that in the first round, I sparred with an equally matched opponent. But I beat him by a few points. This boosted my confidence considerably.

In the second round, I was again sparring with someone of roughly my own build. Punches were exchanged and kicks were dealt. I was concentrating so much on landing left punches (which I was famous for in my karate class) that I barely noticed anything else. I heard no sound. I saw no faces peering at us from all around. I saw no referee (or whatever he is called) moving close by us. I had no idea whether I was even within the sparring ring line I was not supposed to cross. There were no nagging thoughts about home, no doubt of losing, no wondering how bright the lights were, what people were thinking about me, or whether my stance made me look awkward. I was doing the only thing I was supposed to be doing: defeating the opponent.

Then came the whistle.

This shook me up. Suddenly, as though someone had turned up the master volume of the world, I could hear everything: the loudspeaker announcing results from the ring next to ours, the eager crowd's cries of encouragement and suggestions to land a roundhouse kick, the referee calling us towards him, plastic bags and covers crinkling as their owners took out snacks. My eyes too seemed to recognize the world as if for the first time since I got there. The referee had called us because one of us had stepped out of the ring (I don't clearly remember which one).

But the damage had been done. I feared that state of nothingness and actually felt a bit stupid. This loosened me up considerably, and I lost to my opponent.


What is this state of automatic response by the body and mind to the present situation? I didn't know it then, but this is what psychologists call being "In the Zone".

What is this 'zone' thing? (And where are you when you are not in the zone?)

It is the state where you are performing automatically, kinda pre-programmed. Soldiers and athletes, especially, NEED to be in the zone. To get a glimpse of what athletes and sportspeople experience at their peak, take the everyday activity of brushing your teeth. For almost as long as you can remember, you have been brushing your teeth in the morning (and at night, if you are unlike me). Now, can you recall a day when you were on your way to work and suddenly wondered whether you had brushed your teeth or not? There you go: you've been in the zone, albeit a smaller one than the athletes occupy.

So being in the zone means your body and mind are performing at their peak. You may think that's certainly a good thing, and you won't be wrong. But how do you achieve this state of harmony? Certainly not easily.

Since your mind and body are both involved, getting into the zone requires training them together. For the body, as hinted at in the tooth-brushing example above, practising to the point of automation is the only way in. For the mind, you need to practice Zen meditation. Don't worry: Zen meditation, according to my Max Excel guide at least, "is the easiest to do, although none of us practice". Zen means to "live in the present". In smaller keyword terms: concentration, feeling challenged, confidence, positivity and calm.

How you achieve all those at the same time comes down to another keyword: dedication. I guess it's clear now why athletes seem like well-oiled machines at their activities. Writing an exam, computer gaming, driving a car, eating food (I guess), etc., are some of the automatically programmed sequences of movements we access, much like calling a pre-made function instead of writing it from scratch. This is definitely faster and more efficient than figuring out a new way of doing it each time.
Perhaps your life too has been littered with such events, and you were bewildered like I was. Now, I guess, you know what it means.

(originally posted: Aug09)

Wednesday, August 10, 2005

A simple matter of Evolution.


Someone sent me the picture above, among others, to give me a few laughs. Unfortunately, I couldn't decisively conclude what the author's message was. So I let my right brain run amok and came up with a few good candidates for the author's intention. Here are some:

I guess this one's talking about the posture of the depicted person (coz I don't believe everyone uses a computer in that particular way), which has regressed back to that of the primate at the extreme left.

Another way of interpreting it would be that the mobility of human beings is on the decline. (I can imagine an ad with the above picture, and another one below it depicting the latest human using a palmtop, standing straight and walking around as well, as a promotion for palmtops.)

Or maybe the picture depicts a world where humans evolved but chose not to wear anything even after reaching the info age, with the words below wondering what went wrong on our Earth, where we wear clothes. (Maybe an ad for a nudist club?)

Or maybe the author is all for a hairy guy and wonders what went wrong as humans grew less and less of a hair coat compared to the chimps.

Or maybe it's about the rather abrupt shift from hard tools (spear, jackhammer, etc.) to soft tools like the computer, and the comment can be taken as wondering what caused the sudden shift.


Perhaps the real meaning is not for me to know :)
I guess I have too much free time.

Friday, August 05, 2005

How would you like your probabilities? Relative or absolute?

After my previous post on a quantum theory idea, I went through a lot of emotions: 'uncertain', 'kinda stupid', 'I-shouldn't-have-posted-this', sleepy, and then 'hmmm-there-could-be-something-in-this..' and then 'whatever'.
But I didn't know whether I was correct or not. So take this with a grain of salt (and pepper too, really, it's your call).

It's a very simple and small thing, which is of course why I am building up towards it with all this fluff. OK, imagine someone throwing a ball straight up. Let's say this someone is doing the said innocuous activity in the USA, so it's easy to imagine when I say he is being spied upon by a camera mounted on a street-side pole. This camera observes the ball travelling at 10 km/hr.
Let's say an alien spaceship that was passing near the Earth had suddenly stopped because its fuel had run out (poor them), and at this exact moment the aliens were watching this now almost-popular someone in the USA through their Acme Super Magnifier.
The aliens will notice that the ball was thrown at almost 107,005 km/hr, much to their surprise, by a puny mud-man.

Of course, their surprise wouldn't have lasted more than a few milliseconds, because they would've worked out the orbital speed of the Earth on which the mud-man was enjoying a ride himself (which is, of course, about 107K km/hr).
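If you want to see the arithmetic, the classical (Galilean) picture is just a vector sum of the ball's velocity and the platform's. A tiny sketch, using the round numbers from the story and assuming "straight up" is roughly perpendicular to the orbital direction:

```python
import math

# Rough numbers from the story (assumptions, not measurements):
v_ball = 10.0        # km/h, relative to the thrower, straight up
v_earth = 107_000.0  # km/h, Earth's orbital speed around the Sun

# Galilean velocity addition is a vector sum; with the two
# components perpendicular, the magnitude is the hypotenuse.
v_seen_by_aliens = math.hypot(v_ball, v_earth)
```

(The perpendicular case makes the ball's contribution even smaller than straight addition would: the total comes out only a tiny fraction of a km/h above 107,000. Either way, the platform's speed dominates.)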

Now the moral of the story is that we have to add the speed of the "platform", or frame of reference, to get the absolute speed. But getting the absolute speed here was possible only because I took that to be the absolute speed of the Earth. I don't know if considering the bubble universe theories, which would mean that our universe itself might be translating within its mother universe, would change the speed of every object.
(Of the bubble universe theories that I know of, one says that universes are made in clusters, much like bubbles in a foam. Another says that bubbles of universes form inside a much bigger universe, to explain the uniformity of the present universe ("uniformity of matter in the present universe" is in itself a much bigger and more interesting concept). Sounds like some pot-smoking-60s-hippy theory? Tell me 'bout it.)

Now, if you have read this post of mine over here, which I somehow doubt any soul other than me did, you would've gotten a fleeting glimpse of where I'm going. Basically, I thought that if the parallel universes suggested by quantum theory are how I imagine them to be, then you can imagine that the universe you (THIS you) exist in is one of the (temporal) leaves of an immense tree that branches out whenever a chance event occurs in that universe.

Now to tie "relativity" and quantum physics together. Imagine our world is shrunk down to the sub-atomic level and that you are banging your head on the walls of your sub-atomic house. According to quantum tunnelling, even though you don't have enough energy to smash through the walls with your head and enter your house, there is a finite probability that you will eventually go through the wall without breaking either the wall or your head (kinda like the "noclip" cheat code in games. Get it? I don't either). This happens only when the difference between your energy level and the wall's is small, and the probability is very small. Let's imagine, for our sake, that the probability is 0.1, so as to avoid writing too many zeroes. This is what I am calling "relative probability", if there can be such a thing.
Relative to what? The big tree of probability that I was talking about, of course. So what is the probability that we are looking for in this tree? It is the probability of THIS universe being possible. Add to this the 0.1 that we assumed, and you get what I call the "absolute probability" (again, if there can be such a thing) of quantum-tunnelling yourself into your house in our little thought experiment without smashing anything.
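For the curious, the textbook estimate for tunnelling through a rectangular barrier is a decaying exponential in the barrier width and in the energy deficit, which is why the probability collapses so fast once "your energy level and the wall's" differ by much. A rough Python sketch of that standard approximation (the electron-vs-barrier numbers at the end are illustrative, not from any real wall):

```python
import math

HBAR = 1.054_571_817e-34  # reduced Planck constant, J*s

def tunnel_probability(mass, energy, barrier_height, width):
    """Rough WKB-style transmission probability T ~ exp(-2*kappa*L)
    for a particle hitting a rectangular barrier with E < V."""
    if energy >= barrier_height:
        return 1.0  # classically allowed; no tunnelling needed
    kappa = math.sqrt(2 * mass * (barrier_height - energy)) / HBAR
    return math.exp(-2 * kappa * width)

# An electron against a 1-angstrom barrier 1 eV taller than its energy:
m_e = 9.109e-31  # kg
eV = 1.602e-19   # J
p = tunnel_probability(m_e, 1 * eV, 2 * eV, 1e-10)
# p comes out to a few tenths here; make the wall thicker or
# taller (or the particle heavier) and it plummets towards zero.
```

For a macroscopic head against a macroscopic wall, the exponent is so enormous that the probability is, for all practical purposes, zero — which is the point of the thought experiment.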

Rest assured that even if what I supposed were correct, this probability would be so small that it could be ignored completely, and things would still work the same as they always have. After all, when you are on Earth, there is little difference whether the Earth is moving at 30 km/sec or 30.1 km/sec (except for projectile calculations, of course).

[Edit: Oops, I rushed it. The way to calculate the current universe's probability of existence should be via conditional probability, not direct addition.]


I referred to this wonderful lecture for the Earth's rotation and its effects. Read the other lectures on that site too.
I also recommend you read this simple explanation of quantum tunnelling. Read the wiki on it too.
For bubble universes, I could point you to this explanation on the web, but I suggest reading a good, simple book on the subject. I am reading Time Travel in Einstein's Universe and Stephen Hawking's books on the universe and such.

Tuesday, June 28, 2005

Consoling the PC.

Given a choice between a PC and a console, I would definitely go for the PC (first). I can do all sorts of things with a PC that I couldn't with a console, I would say. But I may say something different with the XBox 2 and PS3 on the horizon.

I won't go into the spec details of them black and white boxes. It's old news. Converting to the console religion would, of course, require support for all the applications I run and will buy in the future. I am not sure how well Linux ran on the Xbox 1, but that's the kind of thing that will move Cell processors into the PC world, I think. But I don't see myself converting, so why bother.

Recently, I did convert to another camp following my purchase of an AMD-based system. The base I left was Intel and the target base is, of course, AMD. Here's an article that would convert a few of the Intelites.
But CPU fanboyisms apart, one must choose one's processor according to one's needs. The big question in any bewildered CPU shopper's mind is this: AMD or Intel?

After more than a week of continuous research through AMD and Intel CPU and motherboard reviews and forum posts, I found out how to choose. It is simple: if you want your computer to render 3D scenes, encode videos and compile code faster, go for Intel. On the other hand, if you want to do number-crunching work and play games, go for AMD. I had decided to go for Intel, obviously. But there are catches abound, of course. One main thing involves the bill: AMD costs a lot less than Intel's similarly rated processors. So I changed my decision in favour of AMD after a seriously non-technical point broke the case for Intel.

Coming back to the topic, I was kinda amused to look at what's happening with consoles and the PC. I mean, the first gaming consoles were built on Atari's and other custom architectures, while PCs did general-purpose computing and were the more powerful of the two. Later, the PlayStation and Xbox adopted the PC architecture, but used triflingly (though adequately) powerful PC components. Now consoles are getting Cell processors, which are much more powerful than the PC's. And if The Matrix has taught us anything, the counterpart (the PC) must gain as much power as the consoles have :) So will we see the PC migrate to Cell processors to maintain the "Equilibrium"? Will we see the PC do a console and overtake it in terms of perf?

I think the guys at Intel can be trusted to make good on their precious Moore's law, at least out of desperation.

Saturday, May 21, 2005

To Smoke or to Smoke

Nah, I am not missing a "not" in the title of this post. This post is about the two huge ideas that went from Smoke to Smoke. Here's what happened:

After quite a bit of ramblin' over the topic for THE project that we are supposed to do at the end of our undergraduate degree, we had finally come to a fork in the road.

The "we" was made of THE TWO guys of IT department in my college plus me. Sure, you can easily run a word-count on the words I spoke at college and still not hit anything more than a few thousands, but it seems whatever little I said have given me some recognition. The other two were vociferous when it comes to voicing anything. Sriram krishnan and Balak are the two best brains that you can have around if you are planning a brainstorming session. I was definetely out-of-league here. I take in things bite-by-bite and come up with something...in the end.

Balak and I were fascinated by P2P and network-related research stuff, while the Microsoft employee Sriram wanted to do something that would give "instant gratification" to the masses. My suggestion was that we do something with BitTorrent to alleviate ordinary web servers' bandwidth problems. (Yeah, I like BT.) Initially, my focus was on eliminating slashdot effects. Then I wanted a one-code-to-solve-'em-all thing: I had planned to write a replacement web server and a replacement web client (actually, a browser) which would incorporate all the BitTorrent goodness.
Then, when I actually sat down and talked it over, Sriram wasn't exactly sure people would change their web servers "just because a bunch of undergrads told 'em so". So we refined the idea and pointed it at a slightly different but more appropriate angle. Not a lot of bloggers do podcasting and videocasting yet, but we could provide a distribution medium for those who do. For a more short-term focus, though, we chose text blogs. Thus was born "Smoke". Read all about it in Sriram's blog posts here.

(There's a running gag that if Balak or I were asked "Why name it Smoke?", we would reply with "Ask Sriram's GF" ;) )

Sriram was more into intelligent search engines, retrofitted with intuitive visualizations, all behind an MS-legacy-usability-for-the-masses UI. But now he wanted to build a machine. A machine so powerful that... OK, it's a virtual machine :)

I was not in favour of doing a virtual machine, partly because I wasn't sure I could crank out something as big as a VM, and mainly because I wanted a P2P project on my resume. But I later agreed to help build the virtual machine, partly because I knew Sriram had no problem working on a P2P thing so I shouldn't have one working on a VM, and mainly because neither Balak nor I could come up with a project-worthy P2P idea.
One interesting idea I came up with involved "TorrC"s, or TorrentContainers, which are just torrent files containing the hashes of other torrent files. Basically, the idea was to give the main tracker server the ability to make trackers out of the peers, dynamically and transparently. I hoped this would stall the tracker-server shutdowns for a while and give the BT sites more robustness and more bandwidth. I didn't really go into the details, as it seemed too small a project, and it was dropped. Recently, I found that the BTHub project (@isohunt.com) is a similar, but more centralized, implementation.

So, thus began the construction of this awesome machine. And it was thus named... (wait for it)... Smoke! It even has a proper little home at sourceforge.net. Sriram is following the building of the SmokeVM at his end in parts: here are the first, second, third and fourth parts (for now).

Since Sriram was the authority on programming languages, virtual machines and stuff like that, we let him split the workload amongst us. Sriram also enlisted more help from outside: Aarthy and Kaushik were invited to join the coding workforce. Before long, each of us had been assigned a task. Sriram would wrestle with the main SmokeVM engine, Balak and I would produce a Python-to-Smoke compiler, Kaushik would produce the parser that reads in the Smoke code file, and Aarthy would script a Lisp-to-Smoke compiler.

Before a month had passed, schedules changed, and our plans followed. So, in the end, we wound up dumping the Lisp compiler and postponing a few of the more ambitious goals past the college project deadline. I wound up coding the Python compiler, while Balak wrote the project documentation (for the college version of the code). Though we had it all planned a full three months early, the entire code and documentation submitted to the college was done in less than three weeks.

The entire project was filled with a lot of everything. I listened helplessly to a LOT of geek noise between the head geeks (Sriram and Kaushik), both in chat conferences and on the SmokeVM mailing list. I had fun coding something and sending it over to Sriram to check against the VM; he would check it out immediately and reply. We were initiated into the world of CVS through this project. I had my own moments, when I n00b'ily created a few modules on the CVS tree and Sriram had to "patiently" delete them. I had great fun constantly updating my compiler code once I finally got the hang of the CVS thingy. I experienced responsibility when I started writing change logs for all my coding updates. We all had our own goof-ups in the project, and I believe we learnt a great deal from it.

I remember being baffled at the very thought of coding a compiler for a language I hardly knew existed (Python). But after I wound up learning the language, I was happy to write the compiler in Python itself. In retrospect, it seems only fitting, because Python has excellent library support for examining its own bytecode. I've promised a be-all-end-all Python disassembly guide once I'm done with the Python compiler, and it will be posted.
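As a taste of that library support, the standard dis module will happily disassemble any Python function into its bytecode instructions, and a compiler targeting another VM can walk that same instruction stream. A minimal sketch (the exact opcode names vary between Python versions):

```python
import dis

def add(a, b):
    return a + b

# dis.get_instructions yields one Instruction per bytecode op,
# with fields like opname, arg and argval.
ops = [ins.opname for ins in dis.get_instructions(add)]
print(ops)  # e.g. loads of a and b, a binary add, then a return
```

A Python-to-anything compiler is, at its core, a loop over exactly this kind of instruction list, emitting the target VM's equivalent for each opcode.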

Throughout the project, sriram was kind enough to ask me to have fun doing this python stuff. I really was having a good time.



PS:
There was another project we considered at the time, but it never had a definite goal. It was for the Microsoft Student Project program 2005, basically a contest where you register your final-year project with Microsoft and compete with students from other colleges for the best project. The winner also gets a chance to enter the Imagine Cup 2005.
We had a [Search engine + Blog + P2P = Cool!] theory that we wanted to make a project out of. Since I got to register the project, I whipped up the name DEBIAN, a backronym for Distributed Enhanced Blog Itemiser And Navigator. A cool insult to Microsoft if the project were awarded the best. I had to endure quite a few words from Sriram before he tried to forget the name I gave the project (pro-MS guy that he is).