I find it very annoying that books for the Kindle are more expensive than the physical ones. I would argue that their value is far lower: it is much easier to share a physical book than a Kindle one, and while the digital version has advantages, for long reading nothing beats handsome paper typography.
Sunday, February 13, 2011
"Those who don't understand UNIX are condemned to reinvent it, poorly." – Henry Spencer

One thing not mentioned in the controversy about ditching Symbian OS is that making an operating system is a solved problem: just port the whole of Unix and you're done with it. If you want to subtract things from it, you had better know what you are doing, 'cos you will get into real trouble.
Android is Linux, which is classic Unix at its core. Mac OS X is a BSD derivative, and BSD is a conformant Unix. Windows 7 is not a Unix, therefore it is a poorly reinvented one. All the Windowzes are the same: perhaps merely adequate for the current fashion in the world of computing, but woefully inadequate for everything else. I would gamble hard on Microsoft not being able to do better than what they have done in the last 15 years.
Posted by Eddie at 11:08 AM
I think this is the first time I write two articles about the same subject in the span of one day; it's just that I found the perfect opposing view to better illustrate my opinions:
a post written in defense of the move. My follow-up:
Scoble says that Windows 7 mobile does not have as many apps as Android, and that's why it apparently flopped. Complete agreement here. I went the Android route when it had about 18,000 apps and the iPhone had about 120,000, roughly seven times as many; merely 15 months later, the numbers are roughly 130,000 for Android versus 300,000 for the iPhone. Both numbers are dizzying, but Android is growing much faster (look at these very exponentially trending graphs). For Windows mobile to ever be successful, it would have to populate its platform with many thousands of applications, and that requires inspiring legions of developers. I am a developer. Do I want to make Android applications? I am dying to; I am thinking about really doing it professionally and all. Do I want to make iPhone apps? Yes, those too, after Android. What about Windows 7, BlackBerry, or Symbian before the news? Absolutely not; I am utterly fed up with bad APIs, and I do not want to pollute my head with a single one more. (By the way, there is no need to learn an API to know it's bad.)
Scoble then goes on to explain that Nokia's very deep distribution network, together with its supplier network, is the game changer that makes Windows Mobile better than Android. Well, bullshit. Nokia's networks might be better than Samsung's, HTC's, etc., but no way are they significantly better, and many of those companies also make Windows Mobile devices... I think that if the promiscuous integration of Nokia hardware with Windows mobile makes it significantly better than the other Windows Mobile devices, what will happen is that the others will progressively withdraw and leave Nokiasoft alone.
Scoble also explains that it did not make sense for Nokia to compete on an Android-level playing field because "Nokia can't compete with China's brightest minds"; so, he sort of admits that Nokia's hardware is inferior to what HTC and the others can do...
So, there you have it: either the virtual merger works for Nokia (which is extremely improbable) and then the Windows platform is left abandoned by the other hardware manufacturers; or it does not work for Nokia, and Nokia gets killed.
Posted by Eddie at 10:15 AM
Saturday, February 12, 2011
Nokia said going Android was like peeing on your pants; IMO, using Microsoft has got to be worse than shitting on your pants
Nokia descended into irrelevancy and, rather than learning from its mistakes, gave up its independence and chose the wrong ally.
First, the title of this article: yes, Anssi Vanjoki said that Nokia relying on Android for its devices would be like peeing on your pants for warmth in the winter, back when he was executive vice president of devices and the front-runner to become CEO. Then, as a friend of mine said, becoming Microsoft's bitch has got to be worse than shitting on your pants. Elop said about using Android that the "option was carefully examined, but would have left Nokia with little control over its destiny and killed its ability to differentiate from rivals" (quoted from here, got the link from here). Am I the only one who thinks this is an evident contradiction? How is becoming Microsoft's bitch any better than being one of the many Android suppliers? Becoming Microsoft's bitch lets you differentiate, alright: as the one that made the dumb choice.
There are several precedents that illustrate how bad the decision is to dump Symbian (a nice platform, by the way) not for Android but for Windows 7; for brevity and similarity I will mention only one: Motorola. Apple (or Steven Jobs) screwed them with the Rokr, the precursor of the iPhone, and at one point they had only bad products in the market, zero credibility, and looked pretty much done, a company soon to become another roadkill of emerging technologies. Fast forward a few years, and they have now placed four extremely good products in the market: the original Droid (1) (called "Milestone" in other parts of the world), which I acquired in November 2009 and still use, and which I think was the best smartphone for a couple of months (the iPhone had greater processing power, but not enough to compensate for the disadvantages of AT&T versus Verizon and Android); then came the Droid X, the Droid 2, and now the Xoom. In 18 months, using Motorola went from being a shame to being cool.
Back when the Rokr was launched, I predicted it would flop, and that Apple would get into the handtop business with an iPod + Palm Pilot + mobile telephone, and that this would be hugely successful because Apple would allow what nobody else dared: a platform open to developers that could take the product to the infinity of its possibilities, on a foundation of high quality. I was sure of the high quality because Apple had the consumer-electronics experience of the iPod, including video; the applications for Mac OS X; the porting of Mac OS X to x86, which strongly indicated the capability to port it to ARM; and the nature of Mac OS X itself, classical Unix plus Apple's intellectual property in graphics, media, and user interfaces. I was not certain about the freedom to make applications; I just thought that was the natural thing to do. For exactly the same reasons, as they apply to Microsoft, I am certain that they will continue to fail. Apple did allow some freedom to make apps for the iPhone, for sure, but fell way short of what I expected or hoped for. Then, as I have repeatedly said, the opportunity for what became Android, a truly free platform, was left open, and Android rose to the point of beginning to displace iOS one year after launch, "snowballing" on the combined creative energies of multiple hardware manufacturers and thousands upon thousands of developers who leverage the great default application set developed by Google, which feeds back into raving user and consumer enthusiasm.
Microsoft used all of its might for Windows 7 mobile and came to market with an otherwise compelling product, but the market rejected it as not up to the standard set by iOS and Android; it just isn't. And this is the platform Nokia chose. It is doomed to fail catastrophically. You only have to see that RIMM/Blackberry is fighting for its survival because the iPhone is entrenched and Android is growing by leaps and bounds.
In previous posts, especially "Exploration" (2), I have explained at length why there is no substitute for real knowledge in the direction of a company. Since I mentioned the case of Motorola, I submit another example of this: Sanjay Jha. He is not a marketroid, he is an engineer. I think a marketroid would choose Windows over Android: the big, established partner versus the upstart. We have seen the results with Sanjay Jha; we will see how Elop's Nokia goes to hell pulled by Windows.
There is more to say about this theme...
(1) I recently discovered that what came to be the Droid was originally designed and developed as a Windows Mobile phone. Hard to imagine how it could have turned out to be the fortune-reversing product it was if so.
(2) From "Dirk Meyer", third footnote:
I dread showmanship in leaders. Examples: Steven Jobs, who before ratifying his genius as thrice empire-builder (Apple, Pixar, and Apple again) ran Apple, his first empire, into the ground and had a diet of humble pie for years; Carly Fiorina and her nonsensical acquisition of Compaq; Jeffrey Skilling of Enron; etc. Observe that showmanship is all it takes to convince the meek to take great risks, while real knowledge is what determines their success; showmanship is therefore bound inexorably toward disaster, and the sooner disaster happens (Steve Jobs), the better.
Posted by Eddie at 11:22 PM
Wednesday, January 26, 2011
Basic Entanglement Strategy
I discovered a simple-to-learn puzzle game, Gopherwood Studios' "Entanglement",
while checking Google Chromium's "Web Store".
As a 1300+ player, with a record among the best 25 ever, I think I might have something to say about the basic strategy of the game.
First, the scoring is very simple: when you place a tile, you get an arithmetic progression of points per segment traversed (1 + 2 + 3 + ...). Since the total number of points grows quadratically with the length of the path, it makes all the sense in the world to prefer playing single steps if that way you prepare very long paths that you hope to eventually traverse. The principle is this: say you have the option to place two tiles to make paths totaling 20 segments, either as two moves of 10 segments each, or as a move of length 1 followed by one of length 19. The first case gives you 5*11*2 = 110 points; the second gives you 1 + 19*10 = 191. So, there you have it: it is most efficient to prepare one long path even while making length-1 extensions to your line. This is the cardinal rule. For the sake of example, consider a 4-tile sequence of total length 20 with individual lengths 1, 1, 1, 17: you would get 1 + 1 + 1 + 17*9 = 156 points; versus a sequence of 5, 5, 5, 5, which gives 4 * 15 = 60.
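The arithmetic above can be checked with a few lines of Python. This is just a sketch under my reading of the scoring rule (that a single move traversing n segments scores 1 + 2 + ... + n = n(n+1)/2); the function names are mine, not anything official from the game:

```python
def move_score(n: int) -> int:
    """Points for one move that traverses n path segments: 1 + 2 + ... + n."""
    return n * (n + 1) // 2

def game_score(moves: list[int]) -> int:
    """Total points for a sequence of moves (segments traversed per move)."""
    return sum(move_score(n) for n in moves)

# The examples from the text: the same 20 segments, very different totals.
print(game_score([10, 10]))        # 110
print(game_score([1, 19]))         # 191
print(game_score([1, 1, 1, 17]))   # 156
print(game_score([5, 5, 5, 5]))    # 60
```

The quadratic growth of the triangular numbers is exactly why hoarding length into one long traversal beats splitting it into several medium ones.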
Whenever you have the freedom to place either of your two tiles with many rotation options, choose the one that connects the two longest "free" paths [[ I call a path "free" when neither end is a wall ]] inbound to the square you are placing on. This strategy also minimizes the total number of paths.
In the early stages, try to keep the tiles that, when placed at the six corner positions, would leave no non-free path standing (a corner position is one of those that has three walls). Since in the corners your options are reduced to the minimum, it is worth keeping those tiles around.
Try to connect non-free paths together as soon as possible.
In the later stages, make sure you have an exit path when you get into an isolated hole. Obviously, try to keep the option to go over all the hexagons, but if going to a hexagon and back would give you two short paths at the price of not joining two long paths you could traverse later, then it's preferable to forgo that hexagon altogether.
Lastly, this game is stochastic: if you aim for an absolute record, focus on joining the longest free paths and on joining non-free paths, and assume you will get the tile you need. I don't think you can reliably get 2000 points or more playing it safe; playing safe, with my current understanding of the game, I get around 500 points reliably, but it is extremely hard to get more than 1000.
Fellow blogger Nathaniel Johnston blogs about the theoretical maximum score:
One unrelated recommendation: for Christ's (or Allah's, Buddha's, etc.) sake, please use Google's Chromium rather than Google's Chrome; fellow blogger "Manuel Jose" explains why: http://www.techdrivein.com/2010/05/why-cant-we-all-use-chromium-instead-of.html
Posted by Eddie at 9:20 AM
Tuesday, January 11, 2011
Dirk Meyer surely was a major determinant in the survival of AMD this time around. I am clueless as to why, after proving himself over and over, he got booted.
I did not follow AMD closely these past years, but I know this:
1) AMD had a license to the x86 instruction set that, among other things, prevented it from outsourcing more than 50% of its chip production. The divestiture of what is now Global Foundries and the outsourcing of production violated many of the terms of the license; nevertheless, Intel enforced it only timidly, perhaps due to market conditions: for Intel, it seems optimal to keep AMD in torpor, at the brink of death but still alive, while blocking the whole "Asset Light" strategy would have been fatal.
2) Intel could have kept the legal proceedings about its abusive market practices going ad infinitum, denying AMD closure and cash.
3) AMD's products were utterly unattractive; well, unattractive but perhaps cost-efficient, as has traditionally been the case.
4) Fusion may be exactly what the emerging markets of handtop computers (now known as smartphones) and keyboardless laptops ("tablets") require: simultaneously powerful graphics, computing power, low chip count, and power efficiency, which Fusion integrates probably better than any other option. nVidia's Tegra has the graphics but still requires a processor such as an Intel Atom. Intel, to date, has never been able to do powerful graphics. ARM implementations always require graphics coprocessors. As a further advantage, Fusion does x86 and can thus tap the large development know-how around Windows (*1) and Linux for x86.
5) Global Foundries has been a great success, covering the flank of where the chips are going to come from.
It does not make sense to abruptly change leadership right when long term strategic plans are coalescing into market opportunities.
I strongly dispute the thesis that AMD was caught flat-footed entering the tablet market. While it's true it got rid of Geode, its line of low-power processors, not even Intel, for all its might, has consolidated in this space; in my opinion the opportunity is now, when AMD has a compelling option which could potentially take the world by storm. This has been the official reason for booting Meyer, so it merits a closer look.
Let us recap the timeline of smartphones:
1) In the beginning, there was not enough carrier bandwidth nor processing power to squeeze a sufficiently powerful computer into handtops.
a) While useful, handtops such as the Palm Pilot, without wireless, always-on access to the world, could not become mainstream.
b) There wasn't an ecosystem of applications for what we used to call handtops back then: PDAs, "Personal Digital Assistants".
c) The mobile carriers had an oligopoly of abuse over people's communication needs; even today they continue to pretend that it makes any sense to charge $0.25 for a few bytes of transfer if they are an SMS, or $1 for mediocre-quality "ringtones", etc.
2) For reasons not known to me, the people at Palm did not enter the wireless world, but left the opportunity open for RIMM and the Blackberry. Blackberry positioned itself at the top of the market: data-hungry customers willing to pay hard for reading their email wherever they were. They contributed things like their internal messaging system (*2), which would never make sense in a perfect world, but since it provided an alternative to the abuse of the carriers, through things like that they built a new market, what we now call "smartphones".
3) For reasons I don't clearly understand, Blackberry did not become mainstream, which left the opportunity for Apple to do things right and take the world by storm. It helped that RIMM foolishly burned opportunity en masse by courting Microsoft so that their crap would run on Blackberries, instead of creating an ecosystem of application developers, something that not even today have they clearly managed to do, despite the obvious successes of the very different approaches to the same subject from Apple and Google.
4) Apple did everything right: it provided a convergence path for its fanatical user base of iPods, a fashionable, hip, cool way to show status to those who already had smartphones; and, the two most important things of all, it dared to bitch-slap the carriers silly about their asinine policies and it created an ecosystem of application developers.
5) Even when Apple did everything right, it left open the opportunity for Google to go full-tilt with free and open alternatives, which they took to heart and did everything right on their own.
Please observe that Apple did not get into the smartphone arena late: it gave itself the luxury of letting Blackberry saturate the market, and even sabotaged the Motorola Rokr, an early attempt at a wireless iPod-iTunes phone; this, in my opinion, because the conditions were not yet ripe. This is what leads me to opine that it won't matter that AMD has zero footprint in today's handtops; it is only right now that you can make truly compelling products in this space, and if AMD dares to gamble big, like it did with Opteron, AMD64, and Hypertransport, a gamble it won, or like Apple, which dared to treat the carriers like its bitches, it could take the world by storm. But, as I have said several times before, there is no substitute for real knowledge at the head of a company. I guess that only engineers would have deemed the chances of going the Opteron/AMD64/Hypertransport route worth taking; without the real engineering knowledge of whether the opportunity is there or not, you cannot gamble your life. I fear AMD is losing the real knowledge vital to assess whether opportunities for great gambles are there or not. (*3)
(*1): AMD, please ditch Windows, really. Do not get tangled and tied to a platform in strong decline; on the contrary, embrace wholeheartedly the emerging ones, and, while at it, the free/open ones. Riding Windows-based products, nobody will ever again take the world by storm; this is by design: Windows is the establishment, and 100% of Microsoft's objectives are to preserve the establishment, so it will make sure it is able to control, curb, and nip (castrate) any threat to the establishment, and it is only with that intention that it associates with upcoming initiatives.
(*2): Where I come from, Blackberries continue to be very entrenched in the mainstream. The reason is that people learned to text through their Blackberry PINs: SMS were absurdly expensive, internet chat networks came much later, and full-blown email is not yet practical even today.
(*3): I dread showmanship in leaders. Examples: Steven Jobs, who before ratifying his genius as thrice empire-builder (Apple, Pixar, and Apple again) ran Apple, his first empire, into the ground and had a diet of humble pie for years; Carly Fiorina and her nonsensical acquisition of Compaq; Jeffrey Skilling of Enron; etc. Observe that showmanship is all it takes to convince the meek to take great risks, while real knowledge is what determines their success; showmanship is therefore bound inexorably toward disaster, and the sooner disaster happens (Steve Jobs), the better.
Posted by Eddie at 10:33 PM
Monday, January 03, 2011
When you say "I wrote a program that crashed Windows", people just stare at you blankly and say "Hey, I got those with the system, for free" -- Linus Torvalds
I might add: "but if your program crashes Linux, that's an accomplishment in cracking/hacking".
Posted by Eddie at 8:08 PM
Saturday, January 01, 2011
XWindow is very old, so it is no surprise that someone daring like Mark Shuttleworth might decide it's time to move on toward Wayland. I found this bit about the project's progress important: "[it] is sufficient for me to be confident that no other initiative could outrun it".
Natty Narwhal will be the first step away from XWindow. I personally have not begun using 11.04, but I soon will, so I don't know how far they are going in this step; but everyone, from the developers' summit on, reports this is the most daring release of Ubuntu ever.
I wanted to check on nVidia's support for Wayland, since I have been buying only nVidia cards since my disappointment with the ATI HD 2600XT 512MB back in October 2007. It turns out nVidia has no plans to support Wayland. What a turn-off! As a heavy user of video cards (I program CUDA, and occasionally also 3D), I am disappointed that we Linux users have to fight so hard with suppliers just to get their stuff enabled on new projects. I don't quite understand the attitude either: if, let's say, ATI came out and said "we will support Wayland and will help make sure support is there by the time Ubuntu is based on it", I would switch, begin learning their equivalent of CUDA, participate in their fora, etc., and my influence in my circle of friends would steer business to them. I mean, we are customers they want on their side, because we are "leading edgers", "trend setters", you get the point.
Moving to ATI would be a big step for me. On the HD 2600XT, I hated that the Windows drivers were Catalyst-based, because Catalyst relies on .NET and was incompatible with everything. I kept that sucker around to check the evolution of the drivers in Linux but, unfortunately, never installed it again; I must have it somewhere. In any case, the experience in Windows was very bad, and I gave negative feedback about ATI, about how BAD their products were, based on my experience; it takes something important, like supporting Wayland, for me to revert.
But there are many reports that ATI (AMD) has really caught up in Free and proprietary drivers for Linux, so I might buy an ATI card after 3 1/2 years; I am willing to give them the chance.
Check this video about what's coming in Natty:
Posted by Eddie at 1:37 PM