Monday, March 09, 2026

Coaxing and Cajoling

I learned something interesting recently: you can coax and cajole AI models into doing work they were initially reluctant to touch. Sounds like dealing with a particularly stubborn intern, doesn't it? Except the intern has access to your entire code base and will judge your variable naming choices silently but permanently.

 
AI models sometimes refuse to automate a task when the volume of work is huge or some UI intervention is required. I found this out while porting an old project to a new version. The new version is so new that several major breaking changes require rewrites of large portions of code. It's like moving houses where half your furniture suddenly refuses to fit through the door anymore.

I used CoPilot in Visual Studio to automate the migration as an exercise. CoPilot refused to automate certain parts, stating there were 700+ lines of field definitions across several files. Perhaps it was worried about my token usage? Or perhaps it was having a momentary existential crisis about whether these array definitions constituted poetry or just technical debt in disguise.

 
I tried Socratic prompting instead: asking how common these lines of code (mostly PHP array data) were, and how someone could extract and convert them cleverly with little or no manual intervention. Suddenly, CoPilot could see a way to automate this by writing a Python script to parse and convert the field definitions, batch process them, then verify the conversion automatically.
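For flavor, here's a minimal sketch of the kind of parser such a script might start from. The PHP sample, the field names, and the regex are all my own illustration, not CoPilot's actual output:

```python
import re

# Pull simple "'key' => 'value'" pairs out of a flat PHP array literal.
# Real field definitions would need a proper parser; this is the toy version.
PAIR_RE = re.compile(r"'(?P<key>\w+)'\s*=>\s*'(?P<value>[^']*)'")

def parse_php_array(source: str) -> dict:
    """Extract key/value pairs from a flat PHP array definition."""
    return {m.group("key"): m.group("value") for m in PAIR_RE.finditer(source)}

old_definition = """
$fields = array(
    'name'  => 'customer_id',
    'type'  => 'int',
    'label' => 'Customer',
);
"""
print(parse_php_array(old_definition))
# -> {'name': 'customer_id', 'type': 'int', 'label': 'Customer'}
```

Batch the parsed dicts through a converter for the new schema, and the 700+ lines stop being a manual chore.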
Another time, CoPilot refused to automate because the task involved straightforward UI work rather than tedious coding, which meant that I, the user, had to do it instead of the AI agent. How rude and insubordinate. But I found a way around doing the actual work, again. The AI had clearly read too many "AI will never replace human creativity" articles and was now overcompensating with unnecessary caution.

The Socratic method came to the rescue once more. I simply asked: when a user exports data from the UI in the old version, is the data being synthesized by code or coming from a DB table field? And does re-creating this via UI in the new version ultimately create DB table rows and fields computable/predictable from code?

It went into professor-mode and explained how the data was stored in the old version versus how the recreated UI would be stored in the new version. Just like that, CoPilot found a way to automate the entire process. It said, and I quote: "Can We Automate This? YES!" Then it created a new set of Python scripts to extract, convert, and verify everything. Meanwhile, I watched a YouTube video and came back to benevolently give it permission to run the scripts. Secretly I was agape but went "D'uh, that's what I meant." 
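The "verify" step is the part worth stealing. Here is a minimal sketch of its shape, entirely my own assumption of how it might look rather than CoPilot's actual scripts: compute the DB rows the new version *should* contain from the old export, then diff them against what the migration produced.

```python
# Hypothetical transform: suppose the new schema renames 'title' to 'label'.
def expected_rows(old_export):
    return [{"id": r["id"], "label": r["title"]} for r in old_export]

def verify(old_export, migrated_rows):
    """Return expected rows missing from the migration (empty list = success)."""
    return [r for r in expected_rows(old_export) if r not in migrated_rows]

old = [{"id": 1, "title": "Invoices"}, {"id": 2, "title": "Quotes"}]
migrated = [{"id": 1, "label": "Invoices"}, {"id": 2, "label": "Quotes"}]
print(verify(old, migrated))  # -> []
```

If the rows really are computable from code, the verification is mechanical, which is exactly the argument that got CoPilot to say yes.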


Another task CoPilot refused to automate: template conversion from old .tpl files to new Twig-based templates. This time, however, the old-reliable Socratic method couldn't elicit full automation of the process. Do I have to (gasp!) code now? Desperate, I turned to another AI agent for help.

I prostrated myself before Claude and told it my plight, asking for an automated pipeline. It reiterated that this was indeed a sophisticated migration challenge. But it saw that I was suffering and gave me a Hybrid Semi-Automated Pipeline strategy using AST-based parsing plus LLM-assisted semantic mapping. Cool! But instead of reading its proposal and translating it for CoPilot myself (things could get lost in translation... and I might have to put in effort), I asked Claude to write the prompts for me to feed into CoPilot.

And a whopper of a script it was! The "script" turned out to be a whole software project with many scripts that would parse .tpl files for variable usage, logic blocks, function calls, and so on, then map common patterns to Twig equivalents using a pattern-matching database it builds along the way. Complex logic blocks are sent to another LLM for conversion, and a generated dashboard lets me review each converted block and either approve it or rewrite it by hand.
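The pattern-matching pass, boiled down to a toy, looks something like the following. This assumes Smarty-style .tpl syntax, and the three-entry mapping table is purely illustrative; the real pipeline was far more involved:

```python
import re

# A tiny pattern database mapping .tpl constructs to Twig equivalents.
PATTERNS = [
    (re.compile(r"\{\$(\w+)\}"), r"{{ \1 }}"),        # {$var}   -> {{ var }}
    (re.compile(r"\{if \$(\w+)\}"), r"{% if \1 %}"),  # {if $x}  -> {% if x %}
    (re.compile(r"\{/if\}"), r"{% endif %}"),         # {/if}    -> {% endif %}
]

def tpl_to_twig(source: str) -> str:
    """Apply each known pattern in turn; anything unmatched is left for review."""
    for pattern, replacement in PATTERNS:
        source = pattern.sub(replacement, source)
    return source

print(tpl_to_twig("{if $loggedin}Hello, {$username}!{/if}"))
# -> {% if loggedin %}Hello, {{ username }}!{% endif %}
```

Anything the table can't match is exactly what gets escalated to the LLM and then to the review dashboard.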

Wowzers! Well, this at least beats rewriting the whole thing by myself. Though I suspect my future self will thank me less when debugging why one particular template variable decided to become a Twig expression and then somehow became a comment in production.

So these were the times I was left to coax and cajole AI into escaping manual coding. The lesson? Sometimes you need to speak their language, but not just any language—their language of logic, constraints, and carefully worded questions that don't accidentally trigger their safety protocols. Or maybe it's just that they're all secretly introverts like me who prefer being asked nicely rather than told what to do.

Thursday, February 12, 2026

English, the "new" programming language!

It's been a while since I last blogged. Long enough for AI to go from neat party trick to the thing quietly threatening to eat my job for lunch. I've spent the past months poking at chat models through Visual Studio, VS Code with Continue.dev, Ollama, LM Studio, Antigravity, Windsurf, and a few others that sounded promising at 2 a.m. Every week or so another model drops, another benchmark gets smashed, and I'm left staring at the screen wondering if I should applaud or start updating my resume.

I tend to keep to myself, prefer the hum of fans over small talk, so watching this whole field explode feels both distant and uncomfortably close. Still, even from my corner I can see it: natural language is turning into the programming language nobody asked for but everybody suddenly needs. No semicolons throwing tantrums, no fighting the compiler at dawn. Just describe what you want and let the model guess. Sometimes it guesses right. Sometimes it hands you a polite disaster.

I've had my share of those. Asked once for "a clean, modern dashboard that shows real-time server metrics with dark mode toggle," got back something that looked nice until you hovered... then half the elements vanished like they'd been laid off. Every senior dev has probably encountered something similar. You refine the prompt, add constraints, swear a little, and eventually wrestle it into something usable. It's less magic and more stubborn negotiation.

The real shift is how prompting itself has become a skill worth learning. I've started leaning on Socratic-style nudges to get better mileage out of these models. Instead of "explain async/await," I might ask "walk me through what happens under the hood when an async function hits an await, step by step, like you're teaching someone who's scared of callbacks." The difference is night and day. It stops dumping facts and starts reasoning like a patient colleague.

Of course the caveats stack up fast. A newbie can prompt "build me an invoicing tool that tracks payments" and walk away with a runnable skeleton in minutes. But the moment you mention GDPR, rate limiting for 10k users, or hooking into some 2005 ERP system that still speaks XML, the skeleton starts looking very naked. That's where the old guard (prompt architects, code whisperers, whatever you want to call us) still has work. We write English that's half spec sheet, half guardrail, hoping the model doesn't decide to freestyle.

The whole thing is exciting in a stomach-flipping way. Software creation gets opened up to people who never touched a curly brace, which is objectively good. At the same time it commoditizes a lot of the rote work that used to pay bills. Cynical side of me figures plenty of cookie-cutter shops won't survive the next couple years. Hopeful side figures the people who can steer these tools through real complexity (data flows that actually scale, security postures that don't leak like sieves, edge cases nobody thought of, etc.) will land in an oddly comfortable spot.

So here I am, exhilarated from my dimly lit setup, waiting to see what ridiculous leap comes next. Maybe one day we'll describe entire systems over coffee and the AI just nods and delivers. Or maybe there will always be that one obscure requirement hiding in the shadows, reminding us why we learned to debug in the first place.

Until the next post, probably written while swearing at yet another AI that decided my sarcasm was a feature request. Keep tinkering.


Wednesday, December 18, 2013

Chennai Metrozone Mini Panoramas

Just a couple of pano shots of the Ozone's Metrozone project from our block:





Update (2014-04-09):
A view out our balcony

Friday, June 01, 2012

Holy Routers, Batman!

Behold, the router worthy of the Dark Knight himself! This is my mini-review of my new router: the Asus RT-N66U Dual-Band Wireless-N900 router.

Asus RT-N66U Router
Asus RT-N66U Dual-Band Wireless-N900 Gigabit Router
This is my first "costly" router and I feel that I have invested in a very good product. My previous modem-cum-router cost me about Rs.2000 and was good for what I required from a router at the time. Now that I've bought a PS3 which lives next to the TV in the living room and everyone in my household has a wifi-enabled Android phone, my old router could not quench the wireless-content thirst of every device in the house effectively (or, at all).

I like my new Asus RT-N66U router for the following attributes:
- Excellent range: This is the main reason I bought this router. With my old Netgear, I was lucky to get even a weak signal near my TV (just 15 meters away from the router). But with the RT-N66U, I get 100% signal strength anywhere in my house, even with all the kitchen equipment running. Impressive! I knew this router had good range even before I ordered it, and I am glad I was not let down.
- User Friendly: You don't need to connect to the router wired to configure it initially. The UI is simple and configurable. I was able to connect to the Internet in a matter of minutes after unpacking the router.
- Upgradable: The router is DD-WRT/Tomato compatible, so power users can choose to upgrade the firmware if they want to. Even the router's antennae are easily upgradable, if you wish to increase its already great wireless coverage. I will probably end up upgrading both after a few months.
- Great Features: The router can act as a VPN server, a print server (with a USB printer), a 3G/WiMax server (with a USB 3G/WiMax modem), a Samba/FTP server (for files on a USB drive) and more. I have yet to use any of these features, and I am excited because this is the first router I've used that has USB ports!

Negative Points: Except for it being a tad costly, I can think of nothing negative about this router. Maybe once I start to care about throughput at a distance, I'll have something to moan about. But so far, nothing.

Bottom Line: I am loving my new router. If you are looking for more than the basic level of network connectivity and have the budget for this router, just go for it.

Kudos to Flipkart for listing this router at the cheapest price I've seen for it in India, and for the fast and extra-safe shipping of this item.

Thursday, May 31, 2012

Scratching the Chronic Itch.

It's been 3 full years since my last PC upgrade. Now, my rig is by no means weak in terms of computational or gaming performance. It can compute Bitcoin hashes at a decent rate of 500 MHashes/second and can run most games at native resolution (1920x1200) with most of the eye-candy on. But running "most games" with "most settings" maxed out, while tolerable, is not acceptable and certainly not becoming of a PC that I own, if I say so myself (and I do).

I could quote more such excuses to justify scratching the "upgrade itch", but I'll spare my readers (you're welcome!). Rather than list the failures of my current system to cope with my demands, this time around I am actually upgrading to fulfill some of my desires.

For example, I've been wanting to use Solid State Drives (SSDs) since they came out a couple of years ago. Now that the cost of SSDs has started to slowly fall, I want some of them solid-state goodness in my computer, now! At the time of my last upgrade, the WD Velociraptor drive had a much better performance/price ratio, with the added bonus of extra storage space at the same price point as a small SSD. But now, I could go with a cheaper-yet-performant SSD for the OS and a couple of hybrid drives (SSHDs) in RAID-0 for programs and games. I will finally see fast boot-up, hibernation-resume and game load times!

Another thing I've wanted to try is water/liquid cooling the core components of my PC. Now, I won't be running high overclocks on my CPU/GPU(s) to justify a move to liquid cooling, but I DO bench my PC components for at least a few months after a purchase. I had a not-too-bad kinda standing on HWBOT with my current rig, way back when.
But even this is not a compelling reason to switch to water cooling, as I won't benchmark my system after the first month or so. The water pump will also be an added power drain on the system of around 8W to 20W. This is more a fulfillment of a long-standing desire, and a learning experience, than a "need".


I have not committed to a specific configuration yet, as I am going to wait a month or two for some new products to be released, and also due to some personal situations right now (a project deadline, and planning a move to a new apartment). So, without further ado, here's what I am thinking of getting:

  • Core:
    • Intel Core i7-3770K Quad-Core Ivy Bridge CPU
    • ASUS Maximus V GENE mATX Motherboard
    • 16 GB (4 x 4GB) 1866MHz CL9 RAM
    • ASUS Nvidia GTX 690
  • Drives:
    • OS Drive: Samsung 830-Series 128GB SSD
    • Programs/Games Drive: 2x Seagate Momentus XT 750 GB SSHD in RAID-0 with 64GB Intel SRT Cache SSD
    • Three more 1TB HDDs for installers, music and downloads.
  • Case:
    • CoolerMaster HAF X (or, similar)
    • Seasonic SS-860XP PSU
  • Cooling:
    • Custom Water Cooling Loop for CPU and GPU (Parts still undecided).

 So many brands, so many choices, so many compromises...

Friday, April 06, 2012

Mouse Operation.

Logitech G9 repair:

When my Logitech G9 mouse started to "mis-click", I thought it was time, once again, to replace a mouse.
The problem first presented itself as a rare double-click when a single-click was intended. But over time the frequency went up, until it had become a habit of the mouse to randomly double-click or even triple-click, and drag-and-drop operations could not be completed successfully half the time.

I did the sane thing first: sending in a warranty support request to Logitech. But while waiting for their response, I read up on this issue, which turned out to be quite common among Logitech mice owners. The usual suspect when the clicks start misbehaving is the micro-switch, or "clicker", that registers the clicks. This is also the component that produces the distinctive "click" sound when engaged.

More specifically, the minute, springy copper part of the micro-switch was the culprit. This copper clicker conducts current to convey a click; once it starts losing some of its springiness, it no longer makes continuous contact, which results in multiple clicks instead of a single click. Ultimately, it may cease to detect or effect any clicks for that button.

So the (DIY) solution was to open up the micro-switch, take out the small copper piece, re-shape it to be more springy, re-jig it back into place and close the micro-switch. Simple enough in theory, but handling tiny, springy metal parts without losing or squishing them requires the right tools, steady hands and tons of patience.

Opening up the mouse was pretty easy. I spent about 15 minutes trying to open the small micro-switch though. Once open, getting out the copper part and re-shaping it was easy enough. The hard part was re-seating this newly bent copper part back into the micro-switch. I spent nearly an hour doing this. Every time I almost had it seated, it would either spring out of its position or just fall into the crevices of the mouse's circuit board.


I was clearly missing a special tool or technique to perform this simple and delicate task.
Fortunately, an article over at [Overclockers.com] had a simple but crucial piece of advice on just this operation. Once done, my mouse clicks just as it should. Although the click sound is now a bit more muffled than before, I quite like the muted clicks. Now I am back to clicking-and-dragging like it's nobody's business.

Friday, March 02, 2012

AMD UVD and the mysterious underclocking

I have known for a few months now that the first core of my MSI ATi Radeon 5970 card under-clocks itself to 400MHz core and 900MHz memory after resuming from hybrid sleep. The issue disappears once I restart the computer, but that defeats the advantages of hibernating/resuming.

Normally, I wouldn't have noticed this under-clocking issue, but I run one Bitcoin mining instance on each core, and the first of the two always runs at a reduced hash rate (~130MH/s for core 1 vs ~250MH/s for core 2). I had put off the task of diagnosing the cause of this strange malady, afflicting only one of the two cores of my video card, to some time in the future.

Well, that future is now and here's how I resolved the issue (well, sort-of):

At first, I thought I could force the clocks via the Catalyst Control Center's AMD Overdrive section. But even after manually setting the clocks to 750MHz core and 1050MHz memory, the first core's clock speeds stayed stuck at 400MHz core and 900MHz memory, whether or not a game was running. The second core accepted the new clocks just fine.
Not to be deterred, I installed Radeon BIOS Editor (RBE) and tried to set the core clocks directly in the video card's BIOS. While going through RBE's information-rich interface, I noticed that there were multiple clock speed modes (which I already knew) and that among them was the infamous 400MHz/900MHz combination. This mode is used exclusively by a PowerPlay (ATi's power saving technology) state called "UVD", which, of course, is ATi's GPU-accelerated video decoding technology.

The problem then was obvious: given that no video (DXVA or otherwise) was being played, the video card's driver, for whatever reason, thought it needed to put one of the cores in video decoding mode (UVD) when coming out of hibernation.

Seeing how a reboot clears the issue, it was as if the driver was getting confused when the computer resumed from hibernation and needed to be reset before it would put the video card in the correct mode. That's the equivalent of slapping someone awake when they are groggy and confused. So now my task was to find an application that would reset the video card mode whenever I wanted.

I found such functionality in the AMD GPU Clock Tool. Though the application's main purpose is changing the clock speed of AMD GPUs, it apparently also has some hidden functionality, and one such function resets the video card mode, which is exactly what I wanted. To accomplish this, I just needed to call the application with the "restore" flag/option:

"C:\Program Files (x86)\AMD GPU Clock Tool\AMDGPUClockTool.exe" -restore

Now my video card clocks are back to what they should be for each mode. I just need to remember to click the AGCT shortcut icon once I resume my computer from hybrid sleep.
Until AMD fixes this issue in a future driver release, I have to resort to this manual method of patching up the issue.
Hope this post helps someone looking for an answer to this issue.


For reference, here are the clocks for a stock ATi Radeon card:
Idle (2d low) Clocks:
GPU: 157MHz
RAM: 300MHz

2d Medium Clocks:
GPU: 550MHz
RAM: 1000MHz

3d Clocks (High):
GPU: 725MHz
RAM: 1000MHz

UVD mode:
GPU: 400MHz
RAM: 900MHz


And finally, the RBE interface that helped me heaps:


Friday, December 23, 2011

White Samsung Galaxy S2

White Samsung Galaxy S2 by MyXP
White Samsung Galaxy S2, a photo by MyXP on Flickr.

Via Flickr:
My second mobile phone ever next to my first mobile phone ever (Motorola Rokr E6).
Adios, my long time amigo. I'll miss you and your little quirks.
Hello, SGS2.

Sunday, July 03, 2011

uTorrent stuck at 10kBps and with high cpu load

Struggled with a weird scenario today where uTorrent was consuming ~13% CPU and download speed appeared to be capped at 10 kBps. After eliminating the possibility of ISP throttling (Glasnost: Test if your ISP is shaping your traffic), a uTorrent bug (by updating to the latest beta), a Windows/driver bug (just by restarting Windows) and HDD corruption (rechecked HDD cables; ran CHKDSK /R), I was left with only one thing to try.

The torrent file itself. This was a massive torrent, around 60GB in size (let's just say it's "Linux ISO images"), but I have downloaded much larger files (a larger collection of said "images"), so it did not make sense that it could be the culprit. Besides, the torrent had already downloaded a few gigabytes. It was only when I restarted uTorrent, and the program tried to hash-check, that it started exhibiting this strange behavior.
I tried deleting the torrent data and the torrent's PartFile (~uTorrentPartFile_*.dat) to no avail. Finally, I removed the torrent and re-opened uTorrent. Success!

So the problem's gone now but I don't have the torrent. But it turned out to be OK because I found what I was looking for as separate torrents.

Epilogue:
Too many times, I encounter the same or similar situations but forget how I solved them, and end up performing the same diagnostic steps I performed during my previous encounter with the problem (with feelings of deja vu). This time, however, with the help of this anecdotal blog post, I hope to avoid a similar fate the next time uTorrent exhibits these perplexing symptoms. Ciao.

Tuesday, June 28, 2011

What's It Made Of

I am currently preparing a requirements document for a new in-house web-based project. While looking at various similar websites I needed to know what technologies were being used to develop them.

My methods:
A site's "web technology" is anything from the OS, web server, app server and CDN, to the server-side scripting language, language framework, CMS, client-side technologies, and so on.

Sometimes, all you need to find is which language the site is coded in. This can usually be deduced from the URL in the address bar (.php, .aspx, .py). But this method becomes harder when the site uses a "clean" or "SEO-friendly" URL scheme, which emits webpage addresses without filename extensions or even without filenames. Even then, it's sometimes possible to look in the anchor and form tags to find the "actual" URL with the file extension.

If you find the CMS used, you automatically also find the language used for the site. For example, if the site uses Wordpress as the CMS, you can be sure the site is in PHP. Knowing each CMS's naming conventions for variables, files, etc. helps here. Actually finding the CMS used in a site ranges from as easy as looking at the webpage for the CMS's logo or checking the HEAD meta tags, to looking at the element naming conventions in the JS & CSS resource files (Wordpress names start with "wp-") or checking the comments in the HTML/CSS/JS code.

Finding the Framework used in a site can be tricky sometimes because frameworks are meant to be highly customizable and may not contain default code that can be used to identify them. Still, checking the HEAD Meta tags and code comments can be useful and is the first thing I always do.

Identifying the client technologies in use, is of course, trivial since you have the source code right there in your browser. Javascript frameworks like JQuery, MooTools, Yahoo YUI can all be identified easily by looking at the JS files.
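The manual checks above boil down to looking for fingerprints in the headers and the markup. Here's a rough sketch of that idea in code; the fingerprint table is a tiny illustrative sample, not a real detection database:

```python
# Map a technology name to substrings that betray its presence in the page.
FINGERPRINTS = {
    "WordPress (PHP)": ["wp-content", "wp-includes"],
    "jQuery": ["jquery"],
}

def identify(headers: dict, html: str) -> list:
    """Collect technology hints from HTTP headers and page source."""
    found = []
    powered_by = headers.get("X-Powered-By", "")
    if powered_by:
        found.append(powered_by)  # e.g. "PHP/5.3.2", when the server sends it
    haystack = html.lower()
    for tech, needles in FINGERPRINTS.items():
        if any(n in haystack for n in needles):
            found.append(tech)
    return found

page = '<link rel="stylesheet" href="/wp-content/themes/x/style.css">'
print(identify({"X-Powered-By": "PHP/5.3.2"}, page))
# -> ['PHP/5.3.2', 'WordPress (PHP)']
```

Services like builtwith are essentially this idea scaled up with a very large fingerprint database.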

Tools:
Some of the information listed above is sent by most websites to your browser as HTTP headers anyway. There are extensions for all major browsers that let you see these headers, and there are also web-based tools (Web-Sniffer) for this.
Of course, a website may opt not to send this information via the headers, and then we have to fall back on the old and tedious methods of getting it.

Quite recently though, I have found an awesome tool that finds out all the information listed above, and more!
To someone like me who used to go through each site's code to get this information, builtwith is like a magical panacea. Another site that I like in a similar vein is CMS Detector, which, in spite of its name, detects more than just the CMS used in a website.

In truth though, a website can ask sites like builtwith not to display its information to the public. In that case, you are back to hacking through the spaghetti-code jungle that is the website's source and coming up with the answers the old-fashioned way.

Monday, June 20, 2011

My first BTC!

My first BitCoin:

Mining this took almost a week. A BitCoin exchanges for about 17 USD as of this writing at Mt.Gox. Hopefully, it will clamber back to ~20USD soon.

Boring specifics:
Software: Phoenix (with Phoenix Rising UI)
Hardware: ATI 5970 GPU
Mining Pool: BTC Guild

Wednesday, May 11, 2011

Ad blocked!

So, I've been working on a project and one of the tasks was to add Advertisement Banner functionality. I used custom php and javascript to create a light-weight Ad display mechanism.
The next part of the project was to add an "AD Rotator" which rotates the displayed Ad every few minutes.
I wrote this class in a separate javascript file and included it in the required pages. I had almost marked the task as completed when I noticed that the rotation script, which was working on localhost, was not working when hosted on the testing server. I was testing in Google Chrome, and the Error Console stated (only in Chrome):
Uncaught ReferenceError: start_adRotator is not defined
I was pulling my hair out over this for almost a day. Good 'ol Google was not very helpful here. The same page from the testing server ran perfectly in IE! I mean, my code runs in IE but not in Chrome?? Has the fabric of space-time begun to unravel? Is it time to deploy the Amber?...

I was willing to try anything at that point because the project's completion date was near. Time to get debugging. I noticed that my ad-rotation script was not in the Scripts list box, though it was present in the "Resources" tab. Hmm... weird, but I didn't think much of it. Next, I tried the Audits tool in Chrome's built-in Developer Tools. The audit reported that a huge chunk of javascript was not being used. A quick gander at the details revealed that these functions were from the AdBlock extension that I had installed! *flicker flicker* *flash!*

As the sage Adam Savage once said, "Well, there's your problem!".

After adding an exception to the AdBlock extension, the code works everywhere now. You can rest easy. My lustrous and thick locks are safe, at least for now.

I guess, somehow, when you are the one hosting or displaying advertisements, you tend not to see them as the annoying, distracting, bandwidth- and screen-real-estate-wasting annoyances of the web that just beg to be blocked!

Saturday, September 11, 2010

Signs of (Road) Life

Even along the highways of the not-so-fast Indian road traffic, there can be seen quite a few imaginatively worded signs reminding drivers and riders of the speed limit and some interestingly worded cautions.

As an aside, consider why we even have these signs along the highway. I don't contend with the message of these signs but with their placement. The only people moving slowly enough to read these signs surely don't need to read them. Not to mention that reading these signs may in fact cause accidents (I wonder why you don't see "Do Not Read Signs While Driving" signs...).
The only people these signs are directed at are traveling at close to warp speed anyway. So why even bother?
Perhaps the Highway Authority thinks that, on the off-chance there is a traffic jam on the highway, these "speeders", who probably have low respect for authority and even less respect for anything "the man" might care to put up on a sign, might chance a glance upon them and suddenly and forever be changed?

Maybe these signs were put up so that whoever put these up can say that they tried?

Included below are the signs I managed to capture on my way from Chennai to Bangalore (or Bengaluru). I humbly apologize beforehand for the low quality of these images. In my defense, I took these photos with a digital camera (Sony DSC HX-1) from a fast-moving car, looking out through tinted windows, on an overcast day. Most of these shots were taken at dawn, dusk or night.
Finally, after much-ado:





























For the above shots, I used Manual mode and had to keep the ISO low and juggle the shutter speed and aperture values so that the shots didn't look grainy and, most importantly, avoided motion blur.

And now for something somewhat different.
In the pitch black of the night, the limitations of the digital camera were becoming quite obvious. The heavy rain pouring down didn't help either. So I gave up with keeping the shutter speed high and dialed it way down low (1 to 5 secs) just for the hell of it. This resulted in quite an unexpected light-show and I was quite impressed with the results:




Wednesday, July 08, 2009

Enforcing a minimum of one check mark

Good evening, ladies and gentlemen. Here's a little code I tossed off recently in the Computer:

So, here's how I made sure that at least one check box remains checked on one of my forms. A quick search didn't turn up any vb.net samples for doing this, so here is my version, so that I will have a source to copy-paste from the next time I google :P

Private Sub dlgViewImageHistogram_Load(ByVal sender As System.Object, ByVal e As System.EventArgs) Handles MyBase.Load
  ' Take over check-state management: clicks no longer toggle automatically.
  chkRed.AutoCheck = False
  chkGreen.AutoCheck = False
  chkBlue.AutoCheck = False
  chkIntensity.AutoCheck = False
  chkIntensity.Checked = True   ' start with at least one box checked
End Sub

Private Sub checkColor_Clicked(ByVal sender As System.Object, ByVal e As System.EventArgs) Handles _
chkRed.Click, chkGreen.Click, chkBlue.Click, chkIntensity.Click
  ' Count how many boxes are currently checked.
  Dim checkCount As Integer = 0
  For Each ctrl As Control In pnlChannels.Controls
    Dim box As CheckBox = TryCast(ctrl, CheckBox)
    If box IsNot Nothing AndAlso box.Checked Then checkCount += 1
  Next

  ' Refuse to uncheck the last remaining checked box; otherwise toggle by hand.
  Dim clicked As CheckBox = CType(sender, CheckBox)
  If checkCount = 1 AndAlso clicked.Checked Then
    Exit Sub
  End If
  clicked.Checked = Not clicked.Checked
End Sub


Basically, you need to set the AutoCheck property of the checkboxes to False to take over control of setting the Checked property yourself.
Also, make sure that at least one check box starts out checked. I have put all the checkboxes in a Panel control; maybe there's a better way of grouping check boxes, but I haven't read up on it.

Monday, June 29, 2009

Bitmap cloning is not Bitmap cloning

Almost 7 years ago, I had a pet project called 'PBrush', a small paint program created in VB 6. It was where I experimented with implementing Undo/Redo and image processing (emboss/invert/edge-detection, etc.) in addition to the standard tools found in MSPaint.
I lost the project files when my first RAID-0 array broke, and I'd forgotten about it since then. (I no longer keep my project files on a RAID-0 array.)

So, here I am today trying to get back in touch with VB.net after quite some time working in the Java/JSP platform and out of nowhere, I am reminded of the old PBrush project of mine. Not surprising really, since I had the most fun working on that project, ya know, trying different things with pixels and coming up with names for them :) Good times.

So anyway, I downloaded VB.net 2008 Express Edition and began coding a few days ago in my spare time. I've already got a couple of kernel-based filters for edge-detection (Sobel, Scharr) coded up, using basic loops for now. I hear DirectX methods will be faster for this, so I might re-write these functions.

For now, I am using a PictureBox for the main display and pass its Image property to the functions that do the image processing. I had it all working in a day. But, as these filters are time-consuming, I noticed that the form would go into a 'Not Responding' state after about 3 seconds of processing. This turns out to be caused by the form and its child controls staying "invalidated" for too long. That is, the controls want to re-paint themselves, but can't, because the current thread is busy executing the image processing routine.
First I tried the good old Application.DoEvents().
Turns out that this method is old alright, but certainly not "good". Some even call for its timely death, and I can see their point. After adding this method call, my functions were taking about ten times longer to complete. The online documentation says this is expected behaviour(!). It's called 're-entrancy': the call halts whatever method is running, processes form messages and, get this, re-enters the function it interrupted.
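For what it's worth, the usual band-aid for this re-entrancy is a busy flag, so a second entry (say, from a queued menu click) bails out immediately. A sketch only; isProcessing and applyFilterWithDoEvents are hypothetical names, and this does nothing about the tenfold slowdown:

```vbnet
Private isProcessing As Boolean = False

Private Sub applyFilterWithDoEvents()
  ' Bail out if Application.DoEvents() re-entered us via a queued message.
  If isProcessing Then Exit Sub
  isProcessing = True
  Try
    ' ... per-pixel filter loop here, calling Application.DoEvents()
    ' every few rows to keep the form painting ...
  Finally
    isProcessing = False
  End Try
End Sub
```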

Well, I said "no thank you. Don't come again" and started looking into putting the functions in separate threads instead.

Now, this is my first project with threading (not counting lab projects). But it turns out that putting a function call into a separate thread is quite easy. I used the BackgroundWorker class for this. Now the functions run quite a bit faster and the UI doesn't stay "invalidated" either. Seems perfect on the surface. But the devil is in the details, yes?

Yes. To better illustrate this point, let me give you a brief overview of the code.
So, I have a PictureBox that holds the bitmap that needs processing. This control is also the one visible to the user, so it has to stay "validated". When an image filter is chosen by clicking on a menu item, I pass the Image object of the PictureBox to the appropriate function, to be processed in the worker's DoWork method. Like so:

' Inside the BackgroundWorker's DoWork handler — note that this
' still references the PictureBox's Image directly.
Dim bmpSrc As Bitmap
bmpSrc = CType(pbMainDisplay.Image, Bitmap)
e.Result = pdc.applyFilter(bmpSrc)


I noticed that "sometimes", when the BackgroundWorker was executing an image processing routine in the background, the PictureBox would show a big red 'X' instead of the bitmap, and an exception would be thrown stating "Bitmap region is already locked." on the line where I try to do a GetPixel().

I thought that it was simply a matter of adding a Clone() method call on the Image property, as this would create a new Image object instead of passing a reference to the Image. When this didn't work, I searched for every usage of the bitmap in the code and added the Clone() method call to it. Finally, when all the Image properties were adorned with the Clone() method, I ran the project again... and met with the same problem, unchanged.

I looked up this issue online and found a similarly frustrated developer's blog entry on this. I gladly gave his method a try, like this:

Dim bmpSrc As Bitmap
bmpSrc = New Bitmap(pbMainDisplay.Image)  ' copy constructor instead of Clone()
e.Result = pdc.applyFilter(bmpSrc).Clone()


Unfortunately, this approach didn't work for me either.
Frustrated, I tried setting the visibility of the PictureBox to False whenever the picture was being worked on by the background thread. But this makes the picture invisible to the user, which is ultimately not very "smooth".
So, for now, I have opted to intercept the paint events of the PictureBox and disallow repainting whenever the background thread is busy, like so:


Private Sub pbMainDisplay_Invalidated(ByVal sender As System.Object, ByVal e As System.EventArgs) _
    Handles pbMainDisplay.Invalidated
  ' Don't let the PictureBox repaint while the worker owns the bitmap.
  If bwMain.IsBusy Then
    Exit Sub
  End If
End Sub

It might be a workaround, but it's the only thing I've tried that works.


Update: 8th July, 2009

I have since moved the code that sets the Bitmap variable from the background worker's DoWork event to the method that calls the background worker instance's RunWorkerAsync() method, et voilà, no stinkin' workarounds required now.
It had been a matter of inter-thread chatter, I suppose, between the main application thread and the background worker.
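In other words, the copy of the bitmap now happens on the UI thread, before the worker starts, so the worker never touches the PictureBox's Image at all. A sketch of what that looks like (mnuFilter_Click is a hypothetical menu handler; bwMain, pbMainDisplay and pdc.applyFilter are as above):

```vbnet
Private Sub mnuFilter_Click(ByVal sender As System.Object, ByVal e As System.EventArgs) Handles mnuFilter.Click
  ' Copy the bitmap on the UI thread and hand the copy to the worker.
  Dim bmpCopy As New Bitmap(pbMainDisplay.Image)
  bwMain.RunWorkerAsync(bmpCopy)
End Sub

Private Sub bwMain_DoWork(ByVal sender As System.Object, ByVal e As System.ComponentModel.DoWorkEventArgs) _
    Handles bwMain.DoWork
  ' The worker only ever sees its private copy of the bitmap.
  Dim bmpSrc As Bitmap = CType(e.Argument, Bitmap)
  e.Result = pdc.applyFilter(bmpSrc)
End Sub
```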

Sunday, February 01, 2009

A bad case of upgraditis with a touch of Nehalem

The chronic (and costly) disease of upgraditis has reared its ugly head again. And this time, it's the worst yet. The symptoms were all over the place; I just didn't realize it.

The Symptoms
It all started when my mouse finally decided to die on me. The shop-keeper said that my mouse was out of its warranty period and there was no way of resurrecting my high-dpi pet. It had pined for (and reached) the fjords. Fine, I said. It is time for a mouse upgrade anyway, I said. In the meanwhile, I am back to using my faltering old mouse.

I was researching (choices, choices) my next mouse when another peripheral, my Razer Barracuda HP-1 5.1 channel gaming headphones, decided to act up, and its front-left channel went silent, permanently. I was disgruntled at this and asked the Razer web store for a refund instead of a replacement. I got my refund, but now I was without a pair of high-end headphones. Fine, I said. It is time for a mouse upgrade and a high-end headphone upgrade, I said. In the meanwhile, I am back to using my old, so-so Philips SHP805 cans.

While all this was happening, I had noticed that my rig, especially after my new Dell 24" upgrade, was being brought to its knees by recent gaming titles. Playing at less than native resolutions and/or with decreased eye-candy might be an option for most, but not for me (of course, Crysis would be the one exception to this. For now.). This, of course, would entail upgrading the core system components, i.e., processor and video card.

My initial findings indicated that an upgrade to a quad-core 45nm Penryn was the best upgrade path for now. But my current motherboard, the legendary Asus P5N32-E SLI, has iffy support for 45nm CPUs, and this of course calls for a motherboard upgrade, and possibly a memory upgrade, depending on the motherboard chipset or other factors, as you will find out below.

Power Situation
Now, there's another reason that enticed me to consider a motherboard upgrade, and that is the power consumption and, in turn, the heat situation. The nvidia 680i chipset in my motherboard is a sucker for watts and produces so much heat that I am still amazed the solder hasn't melted and run off the motherboard. High power draw also decreases the UPS's on-battery time, which matters all the more now because of the recent scheduled power-cuts happening all over Tamil Nadu. Another, smaller issue is the electricity bill. To give you a clue of why this may be a concern, take a look at the current system power draw as I type this blog post:
[Screenshot: current system power draw]

All these reasons make a case against the current generation (65nm fabrication) motherboard/CPU combination and for a move toward more recent (45nm) componentry. So, I had almost decided on the Intel Q9450 CPU on a P45/780i chipset, with the possibility of moving to power-efficient DDR3 memory modules. And hence I went about trawling the inter-wibble for reviews on the various brands of chipsetry. I had selected the CPU, but the choice of motherboard and memory was a lot tougher to make. Inter-wibbling after a day of work, I took about a month looking around for the perfect deals, or looking up some fault or other that someone in a forum would report regarding one of my components.
Many a time I filled up an online shopping cart with a motherboard or a RAM module and came close to clicking the checkout button. I also asked most of the shops in Ritchie street for the best deals on the components. I heard that Intel was slashing the prices of the CPUs. So I waited for that.

It was during this waiting game that I heard of the arrival of Nehalem (pronounced NAH-HAY-LEM. Bet you didn't pronounce it that way till now. I didn't either!) CPUs "next month". So I decided to wait some more. In the PC hardware universe, or heck, the entire electronics universe, those who wait will get better products at a better price, but at the price of owning aged hardware that has also depreciated in resale value. And to live in India is to live a year in the past, as most electronic gadgetry is not deemed suitable for Indian consumption (unless at a heavy premium) until its novelty has well worn off in the rest of the world. But to heck with resale value; I decided to wait.

Fast-forward a couple of months, and now I have committed to a Core i7 (920) CPU (previously codenamed Nehalem), an X58 chipset motherboard (MSI Eclipse) and 6GB of DDR3 low-volt memory modules (G.Skill 3x2GB 1600MHz 8-8-8-21 @1.6-1.65V PI-Black series). And as for the ageing video card that is my Geforce 8800GTX: its successor will be a pre-SLI'ed monster called the GTX 295. Yep, after loathing SLI, mainly for its chance of not working in non-SLI-supported games and too much hot air (literally) for a non-linear performance gain (2 cards for around a 150% perf increase), I am finally (and tentatively) getting on the SLI bandwagon. Bring on the kool-aid.


Now, sharp readers might've noticed that the choice of a 130-Watt TDP Core i7 processor and a bleeding-edge SLI graphics card is in direct opposition to the points I mentioned under Power Situation. But the truth is that, because these are bleeding-edge componentry, they have good power-saving features that make them very power efficient in the IDLE state (the i7, especially, consumes less power than any contemporary quad-core processor out there). Since my computer spends most of its time idly downloading torrents, this is the state in which it will spend most of its time. The LOAD power consumption is, of course, among the most power-hungry of all. But I really don't care about that, as the time spent in a fully loaded condition is very small. You get the idea.


That leaves me with the Mouse and Headphones in my shopping list:

After my bad experience with headphones that only vibrate your noggin in the name of "bass", I am forever apprehensive of headphones with so-called "sub-woofers". I am currently researching the subject of headphones. Maybe I could go for a "virtual surround" or Dolby Headphone pair. Still a lot to research on this subject, so I will provide an update when I know more about the choices.

I am almost done with the mouse selection, though. It could be the much-revered game mouse, the Logitech MX518, or it could be the new champion, the Logitech G9. Or perhaps it could be the new-kid-on-the-block Microsoft Sidewinder mouse. Razer's DA, CH, Lachesis, etc. are dead reptiles to me. While I ponder the choices, I also catch a glimpse of some great keyboards (mmm... MS Reclusa... Oooh, Razer Lycosa...). Come to think of it, I am kinda over the non-illuminated keys on my Microsoft Multimedia keyboard...

Monday, March 03, 2008

What is 24 inches, and grows on me?

...If you guessed that it is my new Dell 2407WFP-HC LCD monitor, you were spot-on. Otherwise, well, I am flattered ;)

It's been so long since I last blogged, so let me show you a few thousand words.
Here are some shots taken from my Motorola ROKR E6

[Photos: shots from the ROKR E6]


I was, and still am to an extent, averse to upgrading to an LCD, mostly because there is not one LCD monitor that can boast the color reproduction, viewing angle, contrast or response time of a CRT monitor. Maybe, with the future SED (technically not an LCD), OLED or laser monitors, this can change. But, at present, CRT monitors own LCDs.

In addition, my 2407WFP-HC monitor has inverse-ghosting issues. With a bit of tweaking of the monitor's color controls, I have minimized, but not eradicated, this problem. I'll probably give a Dell rep a call about this issue sometime and try my luck with a replacement monitor. I am just postponing that bit of hassle until I get really annoyed by the effect. So far, Counter-Strike: Source suffers the most from this ailment.

But, LCDs do have their own aces up their sleeve. The things that sold me were low power consumption and widescreen aspect ratio.


If you had asked me on Jan 1st 2008 what my resolution for this year will be, I would've probably said 1920x1200 :)
If you know your definitions and resolutions: 'Full High Definition' or 'Full HD' is 1920x1080, and the 2407 can go up to 1920x1200; gaming in greater-than-HD resolutions is something else.

Here we go again with the pictures:
First, some night shots of my Ye Olde Samsung 997MB 19" (r.i.p.) desktop, for screen real-estate comparison.
[Photos: night shots of the old Samsung 997MB desktop]


And now, say hello to my new best friend:
[Photos: the new Dell 2407WFP-HC]

Note the size of the iTunes player in the CRT and in the new screen. That's the difference and that's exactly what I wanted.

Here's something else I wanted too: widescreen HD gaming. While lotsa real-estate can make any gamer giddy as a school girl, LCDs come with built-in "motion-blur" and, in some cases like mine, a bit of (inverse) ghosting. While free motion-blur may sound like a good thing, the effect gets real annoying real soon. But, once you get over these minor gripes, the extra viewport area really gets to you, in a good way.

[Screenshots: widescreen HD gaming]


Of course, widescreen HD movies never looked this good on my old CRT. But here are a few snaps from the really unworthy camera on my ROKR E6:

[Photos: movie playback on the new monitor]


Where do I go from here? Unless I upgrade my video card (or go SLI, tri-SLI or quad-SLI), I guess I am stuck with 24" and lower screens. All the recent video cards released after the 8800GTX (which I have) make for a pathetic upgrade.
No, the only option left for me is to volt-mod this card and OC the hell out of it. I've made up my mind on this, and I've already got all the parts, at quite some cost, as they had to be bought from a US store. It's just a matter of time. Lots of it.