Wednesday, January 26, 2011

Entanglement Basic Strategy


I discovered a simple-to-learn puzzle game, Gopherwood Studios' "Entanglement",
while checking Google Chromium's "Web Store".

As a 1300+ player, with a score among the best 25 ever, I think I might have something to say about the basic strategy of the game.

First, the scoring is very simple: each tile placement scores an arithmetic progression of points per tile crossed in that move (1 + 2 + 3 + ...). Since the points per move grow quadratically with the length of the path traversed, it makes all the sense in the world to prefer single-step moves if that way you prepare very long paths that you hope to eventually traverse in one move. The principle is this: say you have the option to place two tiles to make a path of 20 squares, either as two steps of 10 squares each, or as a step of length 1 followed by one of length 19. The first case gives you 55 + 55 = 110 points; the second gives you 1 + 190 = 191. So, there you have it: it is most efficient to prepare one long path, even at the cost of making length-1 extensions to your line. This is the cardinal rule. For another example, consider a 4-tile sequence of total length 20 with individual lengths 1, 1, 1, 17: you would get 1 + 1 + 1 + 153 = 156 points, versus 15 + 15 + 15 + 15 = 60 for lengths 5, 5, 5, 5.
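These numbers can be checked with a minimal sketch of the scoring rule as described above (a move crossing k tiles scores 1 + 2 + ... + k = k(k+1)/2; the function names are mine):

```python
def move_score(k):
    """Points for a single move that crosses k tiles: 1 + 2 + ... + k."""
    return k * (k + 1) // 2

def total_score(moves):
    """Total score for a game where move i crosses moves[i] tiles."""
    return sum(move_score(k) for k in moves)

# The four ways of building a 20-tile path discussed above:
print(total_score([10, 10]))       # 110
print(total_score([1, 19]))        # 191
print(total_score([1, 1, 1, 17]))  # 156
print(total_score([5, 5, 5, 5]))   # 60
```

The quadratic per-move payoff is what makes the lopsided split win: the last move of the 1, 1, 1, 17 sequence alone is worth 153 points.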

Whenever you have the freedom to place either of your two tiles, with many rotation options, choose the one that connects the two longest "free" paths [[ I call a path "free" when neither end terminates at a wall ]] inbound to the square you are placing. This strategy also minimizes the total number of paths.
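To make the tie-breaking concrete, here is a hypothetical sketch of that rule; the tile names, rotations, and path lengths are all made up for illustration, and the real game state is of course richer than this:

```python
# Hypothetical candidate placements for the square being filled, encoded as
# (tile, rotation, length of free path 1, length of free path 2).
# All names and numbers are made up for illustration.
candidates = [
    ("tile_a", 0, 4, 2),
    ("tile_a", 2, 6, 5),
    ("tile_b", 1, 3, 3),
]

# Prefer the placement that connects the two longest free inbound paths.
tile, rotation, p1, p2 = max(candidates, key=lambda c: c[2] + c[3])
print(tile, rotation, p1 + p2)  # tile_a 2 11
```

Joining the two longest free paths both builds the long path the cardinal rule asks for and leaves fewer loose ends on the board.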

In the early stages, try to keep the tiles that, when placed at one of the six corner positions, would leave no non-free path standing (a corner position is one of those that have three walls). Since in the corners your options are reduced to the minimum, it is worth keeping those tiles around.

Try to connect non-free paths together as soon as possible.

In the later stages, make sure you have an exit path when you get into an isolated hole. Obviously, try to keep the option of going over all the hexagons, but if going to a hexagon and back would give you two short paths at the price of not joining two long paths you could traverse later, then it's preferable to forgo that hexagon altogether.

Lastly, this game is stochastic. If you aim for an absolute record, focus on joining the longest free paths and on joining non-free paths, and assume you will get the hexagon you need; I don't think you can reliably get 2000 points or more playing it safe. Playing safe, with my current understanding of the game, I get around 500 points reliably, but it is extremely hard to get more than 1000.

Fellow blogger Nathaniel Johnston blogs about the theoretical maximum score:

One unrelated recommendation: for Christ's (or Allah's, Buddha's, etc.) sake, please use Google's Chromium rather than Google's Chrome; fellow blogger "Manuel Jose" explains:

Tuesday, January 11, 2011

Dirk Meyer, great CEO. I don't know what happened

Dirk Meyer surely was a major determinant in AMD's survival this time around. I am clueless as to the reasons why, after proving himself over and over, he got booted.

I did not follow AMD closely these last years, but I know this:

1) AMD had a license to the x86 instruction set that, among other things, prevented it from outsourcing more than 50% of its chips. The divestiture of what is now Global Foundries and the outsourcing of production violated many of the terms of the license; nevertheless, Intel enforced it only timidly, perhaps due to market conditions: for Intel, it seems optimal to keep AMD in torpor, at the brink of death but still alive; blocking the whole "Asset Light" strategy would have been fatal.

2) Intel could have kept the legal proceedings about its abusive market practices going ad infinitum, denying AMD closure and cash.

3) The AMD products were utterly unattractive; now they are still unattractive, but perhaps cost-efficient, as traditionally.

4) Fusion may be exactly what the emerging markets of hand-top computers (now known as smartphones) and keyboardless laptops ("tablets") require: simultaneously powerful graphics, computing power, low chip count, and power efficiency, all of which Fusion integrates probably better than any other option. nVidia's Tegra has the graphics but still requires a processor such as an Intel Atom; Intel, to date, has never been able to do powerful graphics; ARM implementations always require graphics coprocessors. For further advantage, Fusion does x86 and can thus tap the large development know-how around Windows (*1) and Linux for x86.

5) Global Foundries has been a great success, covering the flank of where the chips are going to come from.

It does not make sense to abruptly change leadership right when long term strategic plans are coalescing into market opportunities.

I strongly dispute the thesis that AMD was caught flat-footed entering the market for tablets. While it's true it got rid of Geode, its line of low-power processors, for all its might not even Intel has consolidated in this space; in my opinion, the opportunity is now, when AMD has a compelling option which could potentially take the world by storm. This has been the official reason for booting Meyer, so it merits a closer look.

Let us recap the timeline of smartphones:
1) In the beginning, there was neither enough carrier bandwidth nor enough processing power to squeeze a sufficiently powerful computer into hand tops.

a) While useful, hand tops such as the Palm Pilot, without wireless, always-on access to the world, could not become mainstream.
b) There wasn't an ecosystem of applications for what we used to call hand tops back then: PDAs, "Personal Digital Assistants".
c) The mobile carriers ran an abusive oligopoly over people's communication needs; even today they continue to pretend that it makes any sense to charge $0.25 for a few bytes of transfer if they are an SMS, $1 for mediocre-quality "ringtones", etc.
2) For reasons not known to me, the guys at Palm did not enter the wireless world, but left the opportunity open for RIM and the Blackberry. Blackberry positioned itself at the top of the market, among data-hungry customers willing to pay dearly to read their emails wherever they were; it contributed things like its internal messaging system (*2), which would never make sense in a perfect world, but since it provided an alternative to the abuse of the carriers, through things like that it built a new market, what we now call "smartphones".

3) For reasons I don't clearly understand, Blackberry did not become mainstream, which left the opportunity for Apple to do things right and take the world by storm. It helped that RIM foolishly burned opportunity costs en masse by courting Microsoft so that their crap would run on the Blackberries, instead of creating an ecosystem of application developers, something not even today have they clearly managed to do, despite the obvious successes of Apple's and Google's very different approaches to the same subject.

4) Apple did everything right: it provided a convergence path for its fanatical user base of iPods; a fashionable, hip, cool way to show status to those who already had smartphones; and, the two most important things of all, it dared to bitch-slap the carriers silly over their asinine policies and created an ecosystem of application developers.

5) Even when Apple did everything right, it left open the opportunity for Google to go full-tilt with free and open alternatives, which they took to heart and did everything right on their own.

Please observe that Apple did not get into the smartphone area late; it gave itself the luxury of letting Blackberry saturate the market, and even sabotaged the Motorola Rokr, an early attempt at a wireless-iPod-iTunes-phone; this, in my opinion, because the conditions were not there yet. This is what leads me to opine that it won't matter whether AMD has zero footprint in today's hand tops; it is only right now that you can make truly compelling products in this space, and if it dares to gamble big, like it did with Opteron, AMD64, and Hypertransport, a gamble it won, or like Apple, which dared to treat the carriers like its bitches, it could take the world by storm. But, as I have said several times before, there is no substitute for real knowledge at the head of a company. I guess that only engineers would have deemed the chances of going the Opteron/AMD64/Hypertransport route worthy of being taken; without the real engineering knowledge of whether the opportunity is there or not, you cannot gamble your life. I fear AMD is losing real knowledge, vital to assess whether opportunities for great gambles are there or not. (*3)

(*1): AMD, please ditch Windows, really. Do not get tangled up and tied to a platform in steep decline; on the contrary, embrace wholeheartedly the emerging ones, and while at that, the free/open ones. Riding Windows-based products, nobody will ever again take the world by storm; this is by design: Windows is the establishment, and 100% of Microsoft's objectives are to preserve the establishment, so it will make sure it is able to control, curb, and nip (castrate) any threat to the establishment, and it is only with that intention that it associates with upcoming initiatives.
(*2): Where I come from, Blackberries continue to be very entrenched in the mainstream. The reason is that people learned to text through their Blackberry PINs: SMS were absurdly expensive, and internet chat networks came much later; full-blown email is not yet practical even today.
(*3): I dread showmanship in leaders. Examples: Steve Jobs, who, before ratifying his genius as a thrice empire-builder (Apple, Pixar, and Apple again), ran Apple, his first empire, into the ground and had a diet of humble pie for years; Carly Fiorina and her nonsensical acquisition of Compaq; Jeffrey Skilling of Enron; etc. Observe that showmanship is all it takes to convince the meek to take great risks, while real knowledge is what determines their success; showmanship is therefore bound inexorably for disaster, and the sooner disaster happens (Steve Jobs), the better.

Monday, January 03, 2011

Crashing Windows or Linux

When you say "I wrote a program that crashed Windows", people just stare at you blankly and say "Hey, I got those with the system, for free" -- Linus Torvalds

I might add, "but if your program crashes Linux, that's an accomplishment in cracking/hacking".

Saturday, January 01, 2011

Happy New Decade!

At least for Christians...

I am amazed that it is a point of contention that this is the first day of the decade, and that the decade did not begin one year ago. Our numbering of years is defined so that it began with year 1, not year 0. So, the new decade begins in 1 + 2010 = 2011, not 2010.

Ubuntu 11.04 Natty Narwhal and Wayland, AMD/ATI, nVidia

The X Window System is very old, so it is not a surprise that someone daring like Mark Shuttleworth might decide it's time to move on towards Wayland. I found this bit important, about the progress that project shows: "[it] is sufficient for me to be confident that no other initiative could outrun it".

Natty Narwhal will be the first step away from X. I personally have not begun using 11.04, but I soon will, so I don't know how far they are going in this step; but everyone, from the developer summit on, reports this is the most daring release of Ubuntu ever.

I wanted to check on nVidia's support for Wayland, since I have been buying only nVidia cards since my disappointment with the ATI HD 2600XT 512MB back in October 2007. It turns out that nVidia has no plans to support Wayland. What a turn-off! As a heavy user of video cards (I program CUDA and occasionally also write 3D games), I am disappointed that we Linux users have to fight so hard with suppliers just to get their stuff enabled on the new projects. I don't quite understand the attitude either: if, let's say, ATI would come and say "we will support Wayland and will help make sure support is there by the time Ubuntu is based on it", I would switch, begin learning their equivalent of CUDA, participate in their fora, etc., and my influence in my circle of friends would steer business to them. I mean, we are customers they want on their side because we are "leading edgers", "trend setters", you get the point.

Moving to ATI would be a big step for me. With the HD 2600XT, I hated that the Windows drivers were Catalyst-based, because Catalyst relies on .NET and was incompatible with everything. I kept that sucker around to check the evolution of the drivers in Linux but, unfortunately, never installed it again; I must have it somewhere. In any case, the experience was very bad in Windows, and I gave negative feedback about ATI, about how BAD their products were, based on my experience; it takes something important, like supporting Wayland, to make me revert. But there are many reports that ATI (AMD) has really caught up with its free and proprietary drivers for Linux, so I might buy an ATI card after 3 1/2 years; I am trying to give them the chance.

Check this video about what's coming in Natty: