How to install exfat-fuse on 10.5.8 PPC

Just a few notes for the future.

MacPorts installs a lot of dependencies, and it can't compile osxfuse anyway (and you can't fall back on another port anymore) because osxfuse wants Xcode 3.2, which doesn't run on Leopard (only Snow Leopard). Use the binary provided on the official OSXFUSE website instead.

Use Tigerbrew to install Python 2.7 and scons. Just to install scons, MacPorts will pull in a lot of different crap, but at least it knows that Python 2.7 is required for scons to work properly. According to the documentation, scons 2.3.4 is supposed to run on Python 2.5.1, but if you try, you will end up with this error. Apparently the `except ... as` syntax has only existed since Python 2.6:
Import failed. Unable to find SCons files in:
Traceback (most recent call last):
  File "/usr/local/bin/scons", line 190, in <module>
    import SCons.Script
  File "/usr/local/lib/scons-2.3.4/SCons/Script/__init__.py", line 76, in <module>
    import SCons.Environment
  File "/usr/local/lib/scons-2.3.4/SCons/Environment.py", line 56, in <module>
    import SCons.SConf
  File "/usr/local/lib/scons-2.3.4/SCons/SConf.py", line 199
    except TypeError as e:
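
The failing line is ordinary Python: the `except TypeError as e` form only appeared in Python 2.6, and Python 2.5 chokes on it at import time. A minimal illustration of the syntax difference:

```python
# Python 2.6+ (and 3.x) exception syntax - the form SCons 2.3.4 uses,
# and the one Python 2.5 rejects with a SyntaxError:
try:
    raise TypeError("boom")
except TypeError as e:   # `as` binds the exception object; new in 2.6
    print(e)             # prints: boom

# The Python 2.5-and-earlier spelling was `except TypeError, e:`,
# which is in turn a SyntaxError on Python 3.
```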

After you have Python 2.7 and scons, you can follow the instructions on the exfat-fuse website. And it works.

Killed a Radeon HD 7950 in 2 days

I bought a Radeon HD 6870 and 7950 to help mine altcoins, thinking it would all pay off in two months.

No excitement here, these are slaves, workers, brought in to do the work my trusty olde GTX470 can’t do by itself. As such, I only took pictures of the Radeon HD 6870, but not the 7950. Well… it’s dead now. But I’m getting ahead of myself.

The first thing I noticed about the 7950 was that either the cooler (Sapphire Dual-X) sucked, or the chip puts out way more heat than my GTX470. I ran a bit of Metro 2033 on the thing just to satisfy my friend; unfortunately it didn't feel noticeably smoother than on my GTX470. I also ran Crysis 2 in DX11 mode. The added oomph from the 7950 wasn't enough to make Crysis 2 as smooth as DX9 on my GTX470. Overall, gaming-wise, I hadn't gone anywhere. Not that I needed extra performance in games – I hardly play games anymore.

Mining was a good upgrade over my overclocked GTX470. The Radeon HD 6870 didn’t want to work with the integrated Radeon HD 4250 in my second computer, but once I told it it was the only card for me, it set to work making LTC/FTC/CNC at 300kH/s (this is at high intensity). The Radeon HD 7950 managed to score the same amount at low intensity settings in GUIminer-scrypt (preset: 7950 low usage) but at high intensity it could pump out 450kH/s.

This made the desktop incredibly laggy – even when the 7950 was driving only one monitor, both monitors would lag. It seems to be a Windows issue; no wonder so many headless mining rigs run Linux. So I kept all the monitors on my GTX470 instead, and enjoyed smooth desktop operation while the 7950 cranked away.

All in all, 450kH/s + 300kH/s + 150kH/s is not an impressive show. When I bought the two cards, I had banked on the 6870 producing 350kH/s and the 7950 producing ~650kH/s. Turns out those were peak figures achieved by people with watercooling loops.
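
For the record, here's the gap between plan and reality, using the numbers from this post (the 150 kH/s is the GTX470's cudaminer contribution):

```python
# Planned vs. actual scrypt hash rates in kH/s, figures from the post
expected = {"7950": 650, "6870": 350, "GTX470": 150}
actual   = {"7950": 450, "6870": 300, "GTX470": 150}

print(sum(expected.values()))  # 1150 kH/s on paper
print(sum(actual.values()))    # 900 kH/s in practice, ~22% short
```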

Then I found that Powertune was throttling the 7950 down to 64% every now and then. WTF? When I buy a card with a custom cooler, I expect it to be able to run at stock clocks without throttling, no matter the workload! So I raised the Powertune limit and overclocked the card, only to find that the overclock made Powertune throttle the GPU down to 64%… again. I figured that 100% of 925MHz was better than an intermittent 64% of 1100MHz, so I left everything at stock clocks but kept Powertune at +20%. OK, it's now at 100% and not throttling – but I'm only getting 560kH/s, max. The Litecoin wiki said I'd get 600! (I spent some time with thread-concurrency between 21712 and 24000; nothing got me up to 600 on stock clocks.)
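
The throttling trade-off is simple arithmetic – an intermittent 64% of 1100MHz is a lower effective clock than a steady 925MHz:

```python
# The Powertune trade-off, with the clocks from this post
stock_mhz = 925
oc_mhz = 1100
throttle = 0.64  # Powertune cutting the card to 64%

throttled_oc = oc_mhz * throttle
print(throttled_oc)              # 704.0 MHz while throttled
print(stock_mhz > throttled_oc)  # True - steady stock beats a throttled OC
```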

Nevertheless, I had other concerns. The card was getting really fucking hot – the GPU was reaching 83°C – and loud. Freaking loud. The 6870 was working hard too, yet I couldn't hear it over the background noise coming in from the open window. The 7950 made my PC sound like a blade server, and the sheer heat scared me into keeping both the case and the window to my room open. With the case closed, the heat from the 7950 made my Phenom II X6 just as hot as if it had been working at 100% as well. I decided to leave the 7950 alone at 560kH/s and not overclock it.

I put up with this racket for two days, helped immensely by a pair of green foam earplugs and copious amounts of cold tap water on my body. In the end, I decided to give the computer a rest – I was hearing a rattling fan somewhere in there. Killed the miners (I got one block of BTB! yay) and let it idle at the desktop for a while. Then I shut the system down.

Ah, some peace and quiet. The Corsair Graphite 600T is making cracking noises as it sheds the heat – damn, that was some workout. The CPU's heatsink is hot, despite the CPU having idled the whole time. The heatplate on the 7950 burnt my finger. The GTX470 is doing just fine despite sitting just below the 7950 and running cudaminer (I don't know how, but Gelid's Icy Vision custom heatsink is an incredibly good performer). Really, such peace and quiet. XFX's 6870 is working away reliably in the other computer, making a loud whooshing sound but nothing really grating. OK, it's time to get back to work.

I press the button, and my LED fans flash on for an instant, and quickly die. I smell something. Fuck. Was that my mainboard? Was that… anything? I press again and again; the computer refuses to turn on – it seems the PSU's short-circuit protection is working. I pull out the Radeon HD 7950 and the computer boots.

God damn. So much for high quality Black Diamond chokes. So much for the Dual-X cooler. So much for Powertune. So much for mining!

Lessons learned:
1. Graphic cards should be seen and not heard.
2. Slow, steady and silent is actually preferable to fast, hot and raucous (does this mean I should mine LTC instead of the more profitable FTC/CNC?)
3. Sapphire’s Dual-X cooler isn’t all that. Shitty hardware review sites say the cooler keeps the GPU at 63C while staying silent, without mentioning that this is only because Powertune is silently throttling the card in the background. Although Sapphire’s Dual-X heatsink is “custom”, the aftermarket Gelid Icy Vision dissipates the same amount of heat silently, without any fuss. I’m sure my GTX470 can put out more heat.
4. Most importantly: mining is for suckers. Buying hardware just for mining is for real suckers.

The 7950 is going back for a refund. It wasn’t perceptibly faster than my GTX470 anyway, despite what AnandTech Bench may say.

Will reducing capture resolution reduce chroma noise from my smartphone’s camera?

So I was hoping that maybe, just maybe, telling my HTC One S to capture at 4MP or 2MP would cause it to downsample the image and reduce chroma noise. It doesn’t. Chroma noise is just as loud as ever, and I’ve got the images to prove it. Images are too large, and I’m too lazy to fiddle with HTML to make them fit, so here’s a lazy directory link.

Adaptec AHA-2940U on Windows 7 64bit

Just a reminder to myself on how to set this up in case I need to do it again in the future (I hope not). Needed this to work with a Nikon LS-2000 scanner.

Firstly, download the driver, yanked from a Windows Server 2008 64-bit installation, and unzip it somewhere. Now, Add New Hardware may or may not work (in my experience, it didn't). What I remember doing to get it to work is navigating to the “Have Disk” button: Update Driver → Browse my computer for driver software → Let me pick from a list (even if you provide the folder, it may not automatically “find” the driver despite the inf and sys files being right there) → Have Disk. Then it'll work.

It isn’t digitally signed. But this is what it should look like:


This is where I heard of the “have disk” way:

Don’t feel bad Phil! I had a little trouble getting it to work too. I then left the folder on my D: drive and went through the procedure as if I were installing it as a new driver. When it wants to browse or have you tell it where the folder is located, select that you will choose from the Adaptec selections available in Windows 7. Then select “I have a disk” and direct it to drive D: or wherever you have the folder stored. It then installed the driver okay and I am running just fine. Good Luck and thanks to the person who found this fix.

More information is in the original thread. Especially useful is this:

There’s a lot of discussion here to this dilemma concerning Windows 7 and a driver for the Adaptec 2940 and I wanted to throw my two cents in just in case someone else finds this in a search for this problem.

Here’s my setup: Windows 7 Pro x64, Adaptec 2940au SCSI to control an Epson scanner GT-10000+. Just upgraded from XP x32 but needed Win 7 x64 for some back up issues I was having with my Home Server 2011. (long story, but all resolved now)

So anyway, I could never get the Adaptec SCSI driver to install. I started with “Adaptec_Aic_78xx” driver from the Adaptec site that I placed in the “Driver Store” which installed in the “Device Manager” but wouldn’t work (code 10). I had also found “Adaptec_7870_and_2906_and_2940_driver_for_Windows_7_-_64_Bit” which I tried to install with the “Update Driver” in the “Device Manager” with no results.

Then I saw something someone said about using the “Have Disk” in the “Update Driver” in the “Device Manager”. I used it and pointed it into my file that I had put into the “Driver Store”: “Adaptec_7870_and_2906_and_2940_driver_for_Windows_7_-_64_Bit” and bingo it started loading! It gave me the pop up window about an unsigned driver, which I clicked through and finished the installation. It was recognized in my Hamrick’s VueScan too and scanned beautifully!

So, there was how I got mine going. I hope it works for others trying to get their Adaptec SCSI card going.

Why Macs are a great development platform

Developing on Windows isn’t that great. If you want to code something quick and dirty, or make something that actually does something instead of wasting tons of code on GUI fluff, that means forgoing the GUI, and that means dealing with the horrible cmd.exe. I don’t know anything about reading exit status codes in Windows, nor will I have the slightest urge to learn, because I can’t resize cmd.exe. That’s reason enough. Cygwin? A hack, and I hate typing /cygdrive/c to change drives. Plus, pathnames in Windows are just so long. I just don’t get the feeling Windows is a good environment to code in.

Linux. Ah, Linux. I used to use it all the time. Then I found out that instead of using it, what I was actually doing was maintaining it. Just like in Windows, there’s always something to be done: USE flags to be winnowed out (and the whole system recompiled), manpages to be read (there’s a whole lot of that), today X won’t start (there’s a lot of that too) so I spend the whole day troubleshooting it, getting desktop compositing working, messing with gtk-engines to make my desktop environment look the same across programs that use different GUI toolkits (I seriously hate that; even Windows got it right), reading through urxvt’s manpage to make heads or tails of all the crap it expects you to dump in .Xdefaults (because the default appearance just sucks), choosing this WM or that DE or that particular terminal, configuring and compiling the kernel, and when packages get updated… great. Especially on Arch Linux. Every now and then they make some drastic change that makes pacman spit out a stupid error, and then I have to read the News and follow the instructions “very carefully or you could hose your entire system!!!”

You get my point.

OS X is different, though. My MacBook Air is a miracle sent down from heaven – if I had gotten a Thinkpad X230/X1 Carbon, that would also have been a miracle, albeit a smaller one. The small SSD makes me go light on distractions like music and videos. You can do everything with just a few touchpad gestures. I can’t live without virtual desktops and TotalTerminal. Sublime Text 2 not only looks like it’s part of OS X, it does so out of the box. I know Sublime Text is available for Linux, but it would be the only good-looking program of the whole bunch, and that just sticks out like a sore thumb. Windows doesn’t have an aesthetic anyway, so it would stick out there too. I just keep thinking of Notepad++ anyway. The whole laptop is silent and just works. JUST WORKS. I cannot emphasize enough how important this is. There’s always something that isn’t working in Linux. In Windows things do work, but you have to work a bit to get them working – say, running Lenovo’s crappy homemade program just so you can switch from the integrated to the discrete GPU. On OS X, it’s all… invisible. Magic!

It’s magic! It is well worth the money spent on it – it’s a great tool that will serve me for a long time to come. Sturdily built, no software fusses, beautifully designed, feature rich, completely silent… I’m in love.

Oh wait. This was supposed to be about development.

Yes. So Linux is great because everything’s comprehensible. You have Makefiles, you have the configure script, and you have gcc, which does its business and calls ld afterwards. Simple enough. And you can have all that, with the magic of OS X! That’s the beauty of OS X – it’s like Linux, only you don’t have to work so hard to make it work the way you like. The familiarity and maturity of the Linux development environment (why not Unix? Because most Unices I’ve worked with don’t even ship a shell with tab-completion by default, and come with a huge, space-wasting, ugly console font… see AIX/Solaris), plus the “just works”-ness of OS X. It’s a winning combination.

A few thoughts on AMD’s Piledriver/Vishera

It’s obvious from HardOCP’s Bulldozer→Piledriver IPC investigation that Piledriver is faster than Bulldozer largely because of added frequency headroom made possible by higher power efficiency. As Kanter pointed out from the very beginning, Bulldozer is a speed demon. It needs clocks to get anywhere near Intel.

Overclocking attempts haven’t been very fruitful though. Now is still not the time to buy an AMD FX processor. Over time I’m sure there will be a bit more headroom as they tweak the process, and FX processors will be able to clock higher out of the box. If we get supremely lucky, the next generation (Steamroller) might still be on AM3+!

The Upgrade Treadmill (or how I’m never satisfied)

I remember not so long ago thinking “6 cores, baby! This will be the last CPU my computer will ever need!” Then Vishera came out today, in some cases rivaling the Core i7-3770K, and I’m already thinking about buying it.

The GTX470 was the bee’s knees when I bought it in late 2010 at a firesale price (though not as firesale as the Radeon HD 5850, which I kind of regret not buying now). I bought it because Folding@Home ran great on it. Then I found out about Bitcoin mining, which, as we all know, runs best on Radeons, and actually brings in money 🙁 Now I find myself eyeing a Radeon HD 7870 for 230EUR, roughly the same price I paid for my GTX470, and ~15% faster overall – which I definitely don’t need.

I went from an Antec Three Hundred to a more spacious Corsair Graphite 600T. But even though it’s more spacious, it’s also much louder, and the LEDs at night can be a bother, and there’s no way to turn off the fans without the computer overheating, especially with me Bitcoin mining.

I went from a Silverstone S12II 520W PSU to a Corsair TX850 so that I wouldn’t hear the PSU when running my overclocked CPU+GPU at full tilt. But looking back on it, the fixed cables cause a lot of airflow problems – I should’ve gone modular.

I suspect my Samsung SyncMaster 950p CRT is making some kind of noise that makes the tinnitus in my right ear more apparent. I’m going to replace it with an Eizo LCD off eBay soon (not taking any chances with my health here).

I bought a Nikon SB-26 for my Nikon F5. Now I need a cord for it or something so I can use it away from the camera. Strobist recommends guessing how much power you need to set the flash, which is fine for a person with a DSLR, but I’m on film here! I can’t afford to fuck around with overly bright/dark flash! I need to get it right on the first try! The eventual solution will probably mean more money…

I just bought my wonderful MacBook Air, and now there’s a 13″ retina MBP that weighs just 1.63kg, only 300g more than my Air?

I swear, money just keeps threatening to run out the door.

AMD Athlon XP (Barton, 2002) vs E450 (Bobcat, 2011)

Recently I got myself a cute little Thinkpad Edge E125 without an OS. It came with 2GB of RAM, a 7mm 500GB hard drive, and a really nice keyboard. I was quite happy to get the Thinkpad experience for only RM1200!

So using it, I wondered just how fast it was compared with my Athlon XP 2500+. A high-performance design dating back to 1999, with 512KB of on-die cache on a 130nm process, tweaked to run at higher speeds to compete with Intel’s Pentium 4 (which was rapidly catching up by then), versus a 40nm low-power CPU whose main purpose in life is to be slightly better than Intel’s Atom. Which would be faster?

A rather unhappy coincidence brought the Athlon XP from 1.83GHz down to 1.66GHz, just about equal to the AMD E450’s 1.65GHz. I had two sticks of DDR400-capable RAM, but the VIA KT333 motherboard (which doesn’t really support DDR333) couldn’t set tRAS to 7 – required for stable DDR333 operation of both modules – only 5 and 6 (the BIOS was probably written with only DDR266 in mind). Even then it wasn’t stable; I had to give the RAM 2.7V. And this would’ve been OK if the FSB could run at 333MHz while the RAM ran at 266MHz, but for some reason the RAM speed is always locked 1:1 to the FSB speed, even though there are separate settings in the BIOS. This is the MSI KT3V, folks. Not a particularly overclocking-friendly motherboard.

So I reluctantly put the FSB down to 266MHz and gave the CPU the highest multiplier I could find to make up for the FSB drop. In the end I could only get it to run at 1.66GHz.

Anyway, I started SuperPI 2M, but on the first run the Athlon XP was so much faster than the E450 that I thought something funny was going on. And indeed there was: AVG was updating itself in the background on the E450, eating up precious CPU. I disabled it on both computers. Since the E450 has two cores and Windows 7 loves to bounce threads around, I set SuperPI’s affinity to core 1, and gave core 0 something to do, like dwm.exe and taskmgr.exe.
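
I pinned SuperPI through Task Manager on Windows 7. If you want to script the same idea, here's a sketch of the equivalent on Linux using Python's `os.sched_setaffinity` (Python 3.3+, Linux-only) – an illustration of the technique, not what I actually ran:

```python
import os

# Pin the current process to a single CPU core, so the OS scheduler
# can't bounce it around (the same thing Task Manager's "Set Affinity"
# does on Windows). Linux-only: os.sched_setaffinity, Python 3.3+.
def pin_to_core(core):
    """Restrict this process to one core and return the new affinity set."""
    os.sched_setaffinity(0, {core})  # pid 0 means "the calling process"
    return os.sched_getaffinity(0)

if __name__ == "__main__":
    print(pin_to_core(0))  # {0} - the process now runs only on core 0
```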

The second try worked very well. The Athlon XP finished in 2m 32s, but the E450 required only 1m 55s to calculate pi to 2 million places!

Then I noticed that Diskeeper was taking up 10% CPU on the Athlon XP. I disabled that, and turned Aero off on the E450, and ran the tests again. This time… the E450 was still faster.

AMD Athlon XP 2500+ at 1.66GHz, Barton: 2:32
AMD E450 Bobcat: 1:55

The results were rather amusing. I wanted to compare the GPUs too, but the CPU disparity would probably bleed into any 3D benchmark I’d care to run, so I just settled for GPU-Z. One thing’s for sure, though – Minecraft is playable on the E450, and not playable on my old pal, the Athlon XP. The Radeon 9500 Pro doesn’t have a problem with it, but the CPU is pegged at 100% the whole time.

Left 4 Dead, meanwhile, is playable but slightly laggy at 640×480, low quality, on the Athlon XP + Radeon 9500 Pro, yet playable at 1280×800, medium quality, on my MacBook Air with Ivy Bridge. That’s progress.