Archive for the ‘Bespoke’ Category

Proprietary Probiotic Culture Generator!

Posted March 15th, 2009 on Bespoke

Yoghurt advertisements are the funniest thing on TV. First, they are marketed almost exclusively to women – apparently those of us with a Y chromosome are not deemed by the marketing gurus to delight in fermented milk and fruit chunks with quite the same enthusiasm as the fairer sex. Second is the sheer creativity and variety that they manage to apply to marketing what is a fairly standard product.

But my favourite yoghurt marketing technique of all is the super-pseudo-scientific Patented Proprietary Probiotic Cultures – the glorious amalgamations of biological jargon and pseudo-Latin, such as Activia’s “Bifidus Actiregularis” or “L. casei Defensis”. These names, carefully crafted in the bowels of some advertising agency, are calculated to suggest that what you’re getting is the latest in Yoghurt Technology – when in actuality it is merely a proprietary strain of the same bacteria you will find in any yoghurt.

But why should big food companies be having all the fun? Why can’t you and I have our own patented organisms, too? I think we can. And in order to facilitate this, I have created the Proprietary Probiotic Culture Generator. Click the button below to get your own name – perfect for over-hyping any bacteria-based food products you may wish to market in the future!
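For the curious, the core idea is simple enough to sketch. The following is a minimal illustration of how such a name generator might work – the word lists here are my own invented examples, and the actual generator (whose source is linked below) may assemble names differently:

```python
import random

# Hypothetical vocabulary -- the real generator's word lists may differ.
GENERA = ["L.", "B.", "Bifidus", "Lactobacillus"]
ROOTS = ["Regular", "Defens", "Immun", "Digest", "Vital"]
SUFFIXES = ["is", "us", "ium", "ilis", "icus"]

def generate_culture_name(rng=random):
    """Glue a genus, a benefit-suggesting root, and a Latin-ish suffix
    into one marketing-ready Proprietary Probiotic Culture name."""
    return f"{rng.choice(GENERA)} {rng.choice(ROOTS)}{rng.choice(SUFFIXES)}"

print(generate_culture_name())
```

Run it a few times and you get plausible-sounding nonsense like “Bifidus Immunicus” – Yoghurt Technology at its finest.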

[Image: probioticgenerator_button]

(Warning: due to Microsoft’s terrible support for open web standards, the above link may not render properly in Internet Explorer)

Though I doubt this will be of any use to anyone, as a matter of principle the source code for this generator is available here, under the terms of the GNU GPL v3.

My New Website – nfitz.net

Posted March 11th, 2009 on Bespoke

I’ve wanted for some time to create a personal home page which would serve as my online resume and portfolio – and Lo, It is Done. I didn’t want anything complex – most of my needs are already served between this blog and my various other Web 2.0 accounts. So I wanted something simple, but at the same time unique and personal enough to stand out. I decided to create the site from scratch with good ol’ HTML and CSS – a good exercise in itself. For some of the fancier aesthetic touches I would also need JavaScript, so I set about learning these languages from online sources, most notably w3schools and this excellent tutorial on CSS page layout.

[Image: nfitz_net_screenshot]

For the overall design I was inspired by Luis von Ahn’s homepage (seriously, is there anything this guy does which is not totally awesome?). I purloined some elements of his design, but wanted something uniquely personal. For this I decided to incorporate my hobby of writing with fountain pens, and so all the images for the links were written out by hand, scanned, and made transparent with GIMP. I think this gives the site a unique feel.

The only problem with the approach I took is that the site works in every major browser except versions of Internet Explorer older than 8. I debated for a while whether or not I actually care about this, and concluded that I don’t. If I can find easy fixes that will make it work then fine, but I’m not going to redesign the entire site just because Microsoft refuses to properly implement standards. Besides, chances are anyone I would wish to see the site would know better than to use IE…

All in all it has been a very valuable project, and I am happy with the result. I would welcome any suggestions as to how I could improve the design or layout of the site!

Human Computation: Harnessing the Power of Procrastination

Posted February 26th, 2009 on Terry

In order to comment on a Terry post, you have to fill out a form which looks like this:

[Image: recaptcha-example]

If you’ve spent any time at all on the interwebs, you’re probably fairly familiar with this kind of thing. They’re called CAPTCHAs, which stands for “Completely Automated Public Turing test to tell Computers and Humans Apart” – a rather awkward acronym which nonetheless admirably describes their function of screening out the automated scripts which might otherwise be hawking their manhood-enhancing wares on our poor, unsuspecting readers. The basic idea behind CAPTCHAs is to use a test which is easy for humans but impossible for current AI systems – such as reading highly distorted text.

But this specific CAPTCHA used by Terry, and many other sites around the web, is of a very special variety. It is called ReCAPTCHA, and is the work of a group at CMU led by the brilliant Luis von Ahn, whose goal is to harness the work people do when filling out CAPTCHAs for a useful purpose. Believe it or not, every day an estimated 150,000 man-hours are spent worldwide filling out these infernal boxes! Wouldn’t it be great if that time could be spent doing something useful? That’s the idea behind ReCAPTCHA…

What ReCAPTCHA does is combine bot-filtering with another useful project – the digitization of hard-copy texts such as old books. Modern OCR is highly accurate (>99%), but there are still cases where the software is unable to read a given word accurately – usually the result of some damage or distortion to the text itself. In these cases, the word is converted into an image for ReCAPTCHA and presented to a human, who can succeed where the computer failed. So every time you fill out a ReCAPTCHA, you are helping to digitize and preserve old books!
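The trick, of course, is that the system cannot know whether your reading of the unknown word is correct. The published design pairs each unknown word with a known control word; a rough sketch of that pairing logic (the real service’s matching rules and vote thresholds are considerably more involved than this) might look like:

```python
def check_recaptcha(control_answer, control_word, unknown_answer, votes):
    """Two-word scheme: one word is a known control, the other is a word
    OCR could not read. If the user gets the control word right, we trust
    them enough to record their reading of the unknown word as a vote."""
    if control_answer.strip().lower() != control_word.lower():
        return False  # failed the CAPTCHA; discard the unknown answer
    votes.append(unknown_answer.strip().lower())
    return True  # human verified, and one more vote toward digitization
```

Once enough independent users agree on the unknown word, it can be accepted as the digitized text.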

This is an incredibly clever example of a new field of development called Human Computation (also called crowdsourcing). The idea is to out-source certain elements of computation to humans, who can perform these tasks better than the computer. The challenge comes in creating the incentive for a human to participate. One technique, as used in ReCAPTCHA, is to harness work which is done by humans anyway, such as filling out CAPTCHAs. Another, used by von Ahn’s group, is to turn the work into a game – making the incentive fun! To this end they created GWAP (Games With A Purpose) – a site devoted to games which accomplish useful work, from image-tagging (useful for improving image web-search and making the web more accessible to the visually impaired) to text summarization. By their estimates, if their image-tagging game ESP were played as much as popular flash games such as Bejeweled, all images on the internet would be completely tagged in a matter of months. The power of procrastination, properly managed, is truly a wonder to behold!
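The ESP Game’s core mechanic is easy to sketch: two strangers see the same image and type labels; a label both produce independently is probably an accurate tag. A toy version of that agreement check (the real game adds timing, scoring, and accumulates “taboo” words from previous rounds, none of which is modelled here):

```python
def esp_round(labels_a, labels_b, taboo):
    """One ESP Game round: two players label the same image.
    The first of player A's labels that player B also typed (and that
    isn't on the taboo list) becomes a tag for the image."""
    banned = {t.lower() for t in taboo}
    seen_b = {label.lower() for label in labels_b} - banned
    for label in labels_a:
        if label.lower() in seen_b:
            return label.lower()  # agreement -- a new tag is born
    return None  # no agreement this round
```

Because neither player knows the other, agreeing on a wrong label is unlikely – which is what makes the tags trustworthy.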

But the power of the Human Computation paradigm extends beyond those applications for which it is explicitly designed. Many internet social networking sites can be seen as a form of Human Computation. For example, sites such as Digg or StumbleUpon act as a powerful filter for the vast sea of content available online – only the best content bubbles to the top (in theory, anyway…). Furthermore, the large data collections of Audioscrobbler and Last.fm act as a form of music-similarity algorithm, simply by clustering artists based on the people who listen to them. Dave previously wrote about the power of Google Trends to predict flu epidemics. There is exciting potential here.

Human Computation is an incredibly powerful idea which will continue to develop more interesting and useful applications as the techniques are developed further. If anyone can think of ways we could harness the power of procrastination to solve the problems we discuss on Terry, we could really be in business!

UBC Terry Project Blogging

Posted February 19th, 2009 on Bespoke

I have recently signed on as one of the collaborators on the UBC Terry Project Blog. My posts can be found here.

[Image: terry-143x150]

The Terry project is an initiative centered on using an open-minded, interdisciplinary approach to solving the world’s problems. Amongst other things, it serves to encourage sharing and openness between students from different faculties. As well as the blog, the project hosts a series of guest speakers (including Richard Dawkins last year), as well as an annual conference of student talks modelled on the TED talks. The videos from this year’s conference can be found on the site.

Was it good for you, too?

Posted February 13th, 2009 on Bespoke

(follow up to this)

1234567890

OH-EM-GEE: An Epoch to Remember

Posted February 10th, 2009 on Bespoke

Drop what you’re doing and pay attention: The Interwebs have just informed me of something spectacular. Just over 70 hours from this moment, UNIX time will read 1234567890. Watch the countdown here. As far as I can figure, this is the last “cool” time we will see before the “Unix Millennium Bug” in 2038.
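If you want to check the arithmetic yourself, a couple of lines of Python will do it – both the magic moment and the 2038 rollover, where a signed 32-bit seconds counter runs out of room:

```python
from datetime import datetime, timezone

# The moment in question: Unix time 1234567890.
moment = datetime.fromtimestamp(1234567890, tz=timezone.utc)
print(moment.isoformat())  # 2009-02-13T23:31:30+00:00

# The "Unix Millennium Bug": a signed 32-bit counter overflows at 2**31 - 1.
rollover = datetime.fromtimestamp(2**31 - 1, tz=timezone.utc)
print(rollover.isoformat())  # 2038-01-19T03:14:07+00:00
```

So the big event lands on Friday the 13th, fittingly enough.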

400 Word Essay 3: Computer Models of Cognitive Processes

Posted January 27th, 2009 on Bespoke

I seriously considered not posting this one. Two things went wrong: first, I didn’t get my choice of topic in time, meaning I was assigned the negative side of a topic I would usually argue affirmatively. Second, I was super busy this weekend and didn’t leave myself enough time to do a proper job. But I thought that in order to maintain the intellectual honesty of this series I should post the less stellar examples along with the ones I am more proud of. It was an interesting exercise to try to argue a position I am opposed to. It’s something everyone should try at least once; I think if you don’t find it difficult you should question how secure your positions really are. So as a last disclaimer, I’m not sure how effective the following arguments are. You decide!

Topic: Human cognitive processes can be investigated by creating computational models. (CON)

Computer models, though important to the study of cognition itself, are of limited use in studying specifically human cognitive processes.

In order to be useful in the study of specifically human cognitive processes, computer models would have to function in exactly the same way – same structure, same speed and, to a certain extent, the same material. Since this is not the case with a computer simulation, we should not expect to garner much serious insight by using these techniques. The differences between a computer model and the real functioning of a human brain are currently too large. Even the most highly parallelized computers today have nowhere near the parallelism necessary to properly emulate the functioning of a human brain. They also run at different speeds, which makes the way they process information vastly different. These problems mean the knowledge gained from computer models based on humans will be suspect at best.

Attempting to investigate cognitive processes by building computer models creates a serious boot-strapping problem. The models created will only be as good as the knowledge we have, and so will not be very helpful in gaining new knowledge. We cannot build a good model without good research to base it on – research which must come from primary investigation of humans themselves. At best, computer models can help us evaluate the validity of theories garnered from primary research, but they will not be helpful in adding new knowledge to our understanding.

Looking into the future, as our computer models become more and more accurate, and begin to take on aspects of consciousness, the ethical dilemmas involved with experimenting on such a computer model will approach those involved with experimenting on humans. After all, if our computer models can achieve consciousness we will be forced to seriously consider granting them the rights which go along with such capabilities. We will have to grapple with the fact that a conscious system cannot just be tampered with as we would a modern program. It is, after all, a mind. Therefore computer models will add nothing of value to our research, and we will be better served investigating real humans instead.

400 Word Essay 2: Materialism and Science

Posted January 22nd, 2009 on Bespoke

Time for the second 400 word essay of the term. This week’s topic is quite a bit more philosophical: “There are things in the world that cannot be understood by science”. Those who know me will know which side I took: the negative! It’s very hard to discuss a complex topic like this in so few words – I actually went 50 words over the limit this time… don’t tell anyone!

I recently received a copy of “The Spiritual Brain” as a gift – so we’ll see if Mario Beauregard will manage to overturn my materialism (not likely – I’m 20 pages in and he’s already questioned the validity of the theory of evolution, and revealed that he is funded by the Templeton Foundation).

Without further ado – this week’s essay:


Topic: There are things in the world that cannot be understood by science. (CON)

In this paper I defend the position that the scientific method can, in principle, understand everything in the world. I take this topic to be a philosophical question of the nature of reality, separate from the pragmatic question of whether or not science ever will reveal everything – before, say, the collapse of civilization (I think it likely will not).

My position is motivated by the belief that the world is essentially materialist – there is no such thing as the “non-physical”. Commonly, objections to the ability of science to understand everything involve belief in “non-physical” entities which are entirely separate from the physical world, and therefore cannot be understood by observation and empirical testing. I find this argument not so much unconvincing as completely incoherent. If “non-physical entities” are completely separate from the physical world, then how can they have any causal effect on it? Conversely, if the physical world is causally affected by non-physical forces, then these effects are by definition measurable and therefore can, in time, be understood. It doesn’t make sense to say that a given entity can cause changes to the physical world and yet be unmeasurable.

Since the world is taken to be entirely physical, and governed by physical interactions – the scientific method is uniquely equipped to identify and understand these interactions. By studying data and determining the patterns therein, our understanding increases. Obviously each new gain in knowledge has resulted in new questions which need to be answered, but the principle remains the same. Given enough time this process would inevitably learn all there is to know (though I imagine in reality it will not).

Other objections to the ability of science to reach perfect understanding rely on interpretations of principles of quantum physics, such as the Heisenberg Uncertainty Principle. While I concede that perfect measurement is impossible, it is unnecessary for understanding to be achieved. While recognition of uncertainty may lay to rest dreams of creating perfect predicting machines based on the deterministic laws of physics, it has not stopped advances in understanding of quantum physics. Furthermore it does not prove that the universe is not deterministic, just that we can never have enough information to calculate everything at once (which is different to understanding).

Finally, I close by pointing to the many historical examples of things which were thought to be impossible for science to understand – from heavier-than-air flight to the ability to play chess to the recent gains of neuroscience into understanding of the brain and human nature. While not conclusive unto itself, this trend certainly suggests that we should be very skeptical about any claims of scientific impossibility.

400 Word Essay 1: Public Libraries in the Digital Age

Posted January 19th, 2009 on Bespoke

This term I am taking a really interesting course – “COGS 303 – Research Methods in Cognitive Systems” – which is intended as a guide to doing successful research in Cognitive Science. One of our regular assignments will be to write opinion pieces with a strong 400 word limit – a good exercise in clarity and brevity. We chose our topic from a list and must defend it within the word limit. I’ll post my essays to the interwebs so that they can be evaluated in the harshest of battlefields. Here’s the first:


Topic: The advent of the digital age makes public libraries obsolete. (Affirmative)

Current trends in technological and cultural development make it unlikely that public libraries will survive in their traditional format.

Firstly, the book itself is becoming a thing of the past. Although ebook usage has not become widespread as quickly as many anticipated with the advent of the computer, this slow adoption is beginning to accelerate with the recent development of specialized ebook readers which use electronic-paper technology, such as the high-profile Amazon Kindle, which has sold tens of thousands of units. Just as consumers are moving away from hard-copy formats in music and videos, towards electronic files, the same will happen with books once ereader technology reaches the “killer app” level achieved by the iPod for music. With the decline of the physical book will come the necessary decline of the physical library.

Secondly, the internet is creating a culture where information and files are shared freely, negating the need for public institutions to hoard and distribute books. This has already been observed in music and videos – despite their best efforts, recording companies cannot stop the inevitable free sharing of data. The same process is under way with books – Project Gutenberg makes it possible to find almost any popular public domain classic free on-line, while Google Books is doing the same with more obscure selections. Already there is a large collection of commercial books which have been scanned into digital formats and are available for download (a short search found both textbooks for this course).

The internet presents a better way to achieve the goals of libraries than physical libraries themselves – namely free and open access to information and books. Providing free access to the internet would be a more effective way of making media available than building and supporting large buildings filled with unread books. Once this fact becomes apparent to governments, it will become difficult to justify the larger relative cost of running a traditional library. Relative environmental impact is another point in favour of switching to a digital format.

Furthermore, the internet has demonstrated its effectiveness for bringing people together in a social network to share preferences within a given domain. Last.fm is a popular music sharing and discovery resource. These types of sites are popular amongst the current generation, and are a likely candidate to replace the community fostered by traditional libraries.

Altogether, trends indicate that traditional libraries will become obsolete.

References

http://www.techcrunch.com/2008/05/14/amazon-may-sell-750-million-in-kindles-by-2010-thats-a-lot-of-kindles/

http://www.salon.com/env/ask_pablo/2008/09/08/printers/

Papa LCD’s Contrast Ratio was TOOOOOOO high…

Posted January 18th, 2009 on Bespoke

My new Acer Aspire One forms the perfect addition to my workspace family. Aww – isn’t it cute!

Papa LCD, Mama LCD and Baby LCD