Transportation Planning

Postby garethace » Thu Sep 08, 2005 9:29 pm

The interaction of transportation systems and the environment is something I have become interested in. I first ran into the issue while looking at 'pedestrianisation' versus the 'car-in-the-city', but I can see the debate is much wider. Here is a piece written by Olivia Mitchell in an article in the Times, which reinforced my idea that coordination of minds is called for, not just coordination of projects.

The absence of such a body, which would operate to an agreed agenda and establish spending priorities, means the many agencies involved in Dublin transport continue to operate in isolation. As far as the tunnel is concerned, it seems there has never really been any widely agreed and accepted consensus about its expected role and contribution, and about how it would mesh with traffic flows and other transport projects in the greater Dublin area. The result is that it now seems to be creating as many problems as it solves, and all these problems have costly solutions.


But while I accept the desire for a singular, cohesive body which would provide all of the above, I am not sure she has the form of this idea right. I think there is some intangible element, some collective coming together of minds, which is really important to have; the closest I can come to describing it is better communication between disciplines who have virtually drawn boundaries around themselves. The work and effort to break down those barriers might be just enough, and better than this one cohesive, United Nations kind of idea. Down below, I have quoted a piece about Linus Torvalds bridging the gap that exists between hardware and software in computers. But this quote from Charles Landry, on the RTE radio talk show series, is worth posting up first, to put it in context.

City making is a complex art, involving many disciplines, both soft and hard.

The monopolies [who currently design the cities], as the urban act unfolds, only deal with the bones and skeleton, rather than the blood and tissue.

Highway engineers - usually with bad guidelines and criteria - define how cities work.

Adapt and re-shape cities to suit the needs of the car.

People who understand psychology are not around the table.


I always like to compare the art of making cities with the process of making IT infrastructures, where a similar need for communication exists. What both Charles and Olivia have highlighted above is like the computer scene, where different companies value their own intellectual property, because that property makes up a large part of their capital, wealth and importance. I wouldn't blame the various agencies involved with transport planning for wanting to keep their own intellectual property; that way, it can remain pure and undiluted by other external interests. There is something to be said for that. There is also something to be said for the mixture of different interests, but I think a definition of things is needed. I am not even sure what powers a central transportation body could have to go into the various agencies and extract that intellectual property - they probably wouldn't even recognise the wheat from the chaff.

The open source movement in computer software is about sharing intellectual property in a way that can benefit everyone and enable you to build much better infrastructures. Linus Torvalds claims he is not interested in open source software as a means to making 'free software', but that the open source development model represents the best way to engineer good solutions for complex problems. I think this movement in computing provides a much better model for how to go about transportation planning, and coordination of different transport agencies, than the one Olivia suggests above: a United Nations-style council to gather information from the various agencies. If you look at the United Nations today, it is often hijacked for use as a propaganda platform; Khrushchev hammering a table with his shoe springs to mind. Below is a quote from 'Rebel Code'.

Brian O' Hanlon

Linus then went on to make an important point. The Transmeta platform is 'also a very cool vehicle for doing debugging,' he said, because 'when you control the whole chip, there are lots of interesting things that can be done.' That is, because the x86 processor was created by Transmeta's new Code Morphing Software - which he helped to write - Linus was able to get inside the processor and examine and even hack around with the way the Intel family worked; this was a powerful and unprecedented mechanism for software designers.

Even Ditzel had underplayed this aspect, limiting himself to an interesting anecdote. It concerned a Transmeta customer in Japan who needed a bug fixed in the processor itself (chips are in some ways just software that has been turned into silicon and need to be debugged like programs). 'Normally,' Ditzel explained, 'to get a new CPU [processor chips] would take weeks of fabrication time, testing, and shipping it to them.' The design of the chip would have to be modified, and then new realisations in silicon produced, tested, and sent out to Japan by air.

'What Transmeta did was to send them a new CPU over the Internet. In fact, we simply e-mailed it to them,' Ditzel explained. This was possible because bugs in the silicon could be worked around by modifying the Code Morphing Software. Sending updates to the Code Morphing Software was as simple as sending a patch to any piece of software. 'Crusoe is the only CPU that is software-upgradeable over the Internet,' he went on.

This was the real innovation of Transmeta: The company had managed to turn the closed, black-box chip of Intel into a hackable piece of technology. It was half way to producing a chip that could be changed at will; it would need only to release the source-code to the Code Morphing Software and anyone could reprogram the chip - in just the same way that anyone could reprogram the Linux kernel to suit a particular need.

Not that Transmeta was contemplating such a step. One of the ironies of Linus's move to Silicon Valley was that it saw him working for a hypersecretive company that produced closed-source products. The point was that it had come up with a radically new approach that included what might be called the open-source chip as a possibility.
garethace
 
Posts: 1579
Joined: Wed May 14, 2003 9:01 pm
Location: Dublin, Ireland

Re: Transportation Planning

Postby jimg » Wed Oct 05, 2005 1:37 am

This is not really architecture related, but I've wanted to write something about this subject for a while, and why not here. I may actually expand this a bit and put it somewhere else - I'm not sure an Irish architecture messageboard is a natural host for it.

The Transmeta idea sounds far more innovative than it turned out to be in practice. The idea was pretty trivial to anyone with a historical/technical perspective on IT and the evolution of hardware and software. On the software side, virtual machines were in vogue for a period in the 70s. These programs simulate the operation of a particular processor and can run programs designed (or compiled) for it on a different computer with different features. The idea generated plenty of interest for a while (particularly P-code) but died because of a basic flaw. Despite the notional appeal of the idea, it is no more difficult to write a "native compiler" (i.e. a program to convert a high-level description of a program into the 1s and 0s that the hardware can understand) than it is to develop a "virtual machine", but the former will always allow programs to run far faster and more efficiently. I'm being deliberately superficial concerning the precise technical nature of compilers and virtual machines.
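To make that distinction concrete, here is a toy sketch (Python, entirely invented for illustration - no relation to real P-code or the Crusoe). The "virtual machine" dispatches on each intermediate instruction at run time, and that per-instruction overhead is exactly what a native compiler avoids by translating the whole program once, ahead of time.

```python
# Toy stack-based "virtual machine". The program is a list of
# intermediate instructions; run() interprets them one at a time.

def run(program):
    """Interpret (opcode, argument) instructions on a stack."""
    stack = []
    for op, arg in program:
        if op == "PUSH":
            stack.append(arg)
        elif op == "ADD":
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == "MUL":
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
    return stack.pop()

# Intermediate code for (2 + 3) * 4 - the dispatch loop above runs
# for every instruction, every time the program executes.
program = [("PUSH", 2), ("PUSH", 3), ("ADD", None),
           ("PUSH", 4), ("MUL", None)]
print(run(program))  # 20
```

A native compiler would emit machine code for the arithmetic directly, so nothing like the `if/elif` dispatch survives to run time - hence the permanent speed gap.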

So it was an idea that came, generated interest for a while, was discovered to have flaws, and more or less quietly died in the late 70s/early 80s. Then the idea was rebranded as something magical in the earlyish 90s, when a very unimpressive language called Oak was renamed Java by Sun and captured the imagination of the IT community by piggybacking on the emerging mania for the world wide web (this was a few years before it became a public phenomenon). At the time web pages were very basic, and Java was first used as a way of adding exciting features like animations, sounds, dynamic menus, and the like. It did this after a fashion but quite poorly, was soon superseded by better alternatives, and is now hardly ever used on web pages. Before people realised that this virtual machine idea was flawed (there was general grumbling about it being "slow" but the fundamental idea was never challenged), Sun had moved on and started marketing it as a general purpose language for writing user applications. The original excitement followed even as Java on the web was failing and dying. As it started to prove a poor tool for writing GUI-based applications too, the focus switched to the "back end" - i.e. the servers. It has currently carved out a niche here, but only by using amazingly complicated "virtual machine" implementations ("hotspot" compilers and the like) to compensate, and through the huge mindshare created by the initial hype and excitement. Basically, the same flaw which saw the idea abandoned in the 70s dogged Java from the start. To be fair, the language has evolved significantly, which generally disguises the flaws in its raison d'être. Microsoft are now offering a competitor of sorts, having seen Java gain ground, but are pragmatic enough to make use of their "virtual machines" optional.

In the same way that - until the mid 20th century - big wars in Europe seemed to happen every 25 or 30 years as, I imagine, a new generation came along, promptly forgot what was learned by the previous one, and felt that going to war might be a good idea, the IT world is full of people with no interest in history. This is why failed ideas are surprisingly easily repackaged and sold if an appealing metaphor can be invented to carry them, even if there are obvious technical flaws. And, no, there is no point telling people about the previous incarnation - usually the metaphor has taken on a life of its own.

On the hardware side, re-programmable chips have been around for decades with varying degrees of complexity and sophistication, and have always had an important niche, if not as the main processors on computers. Meanwhile the historical trend since the 80s has been for processor chips to become simpler (even Intel has bowed to the trend by carefully architecting their Pentium processors) as "native compiler" technology has advanced and taken up the slack of converting software expressed in high-level "languages" into the low-level bits that processors can execute efficiently.

Like Java on the web, Transmeta sounded better than it was because you could make appealing claims about "updating the processor by an email attachment" (in Java's case it was "write once, run anywhere"). In reality the claims are pretty vacuous and the technology offered little real advantage - they are solutions looking for problems. It may not be as exciting to talk about getting a fix for your broken native compiler in an email so that your programs get converted without bugs appearing in them but it's pretty much equivalent from a software point of view. So Transmeta used the re-emerging fashion for virtual machines which Java kindled and perfectly traditional hardware techniques but found a beautiful metaphor to make the idea sound revolutionary. As it turned out the metaphor proved to be a bigger breakthrough than the actual idea.

Unfortunately for Transmeta, while Java got its initial leg up by hitching onto the exploding world wide web, the technologies they chose to piggyback on turned out to be duds - i.e. tablet PCs, low-powered laptops and the like. The idea had no redeeming technical advantage, and without the hype (and critical mass) that Java generated, it is dead.

The link here to Linus Torvalds and the "open source" movement is that Linus took up a position with Transmeta. Linus unfortunately has demonstrated, through his well publicised opinions about this and that related to technology, that he is intellectually shallow and was basically extremely lucky with Linux. Nothing he has said or done since releasing the first versions of Linux and generating excitement behind it has been impressive. The growth of Linux has been largely the result of the effort of others. At this stage, there is an emerging feeling (though few articulate it) that Linus is out of his depth technically and as a leader. Again, without historical perspective, Linux may seem miraculous or revolutionary. I use it all the time and have been involved both in building large IT systems on it (involving thousands of computers) and in very small computers using it - so I'm a big fan.

However, the idea that "open source" is something new couldn't be further from the truth. The software world was traditionally "open source" - you bought the source with the program. It was also shared and free - most of Unix was developed as a collaborative effort from the 70s on. Endless software has been produced and given away for free, improved upon by others and given away again. Most of the software behind the internet (on the servers), email and the world wide web was developed this way, and even earlier most of the first very basic tools were developed this way too. So the only way that "open source" and collaborative distributed development is new is in the way it has developed a profile outside the nerdy world of computers in the last few years. On the other hand, Linus was lucky, because at the time the "real" Unix (BSD Unix) was caught up in a weird legal battle (too complicated to explain here, but basically a tiny core - probably less than 10% - of an entirely free and collaboratively developed operating system was claimed to be proprietary). By the time its status was extricated from the legal system, so that version could become available for the first PCs capable of supporting a proper operating system (ones using Intel 80386 processors), Linux - a somewhat technically inferior and certainly less robust copy of a pedagogical operating system called Minix (a crippled version of Unix used to teach operating system basics) - had caught on and generated a following. The BSD variants for PCs (such as FreeBSD) are still going strong, but mostly in the background, and are never mentioned in the popular press.

I'm not even going to try to make an attempt to link this to transportation planning.
jimg
Member
 
Posts: 480
Joined: Mon Nov 22, 2004 9:07 pm
Location: Zürich

Re: Transportation Planning

Postby garethace » Thu Oct 06, 2005 9:02 pm

http://www.cgarchitect.com/vb/showthread.php?t=8109

Jeff Goldblum and aliens - don't knock Java.

From a transportation point of view, it's good technology, when trying to link up to alien space ships.

:-)

Brian O' Hanlon.

Re: Transportation Planning

Postby garethace » Thu Oct 06, 2005 9:36 pm

Just to expand a bit, by use of analogy, on things hard and soft. The concept of money has always been 'hard': hard cash, hard currency. In our language we have this notion of money actually standing for something real and tangible that one can practically touch, grasp and contain. Just like cities are thought to be physical, atom-based things - things you can define, cost, plan, build and subsidise. What more enlightened thinking about urban design has highlighted is the softer issues to do with cities, the flesh-and-veins stuff that binds them together in many ways. So let's just talk a little about money as hardware and software, in Ireland, in the world context of 2005. As I get older, I understand more that we don't change terribly as individuals - but the world around us can change very radically, and set us up in different relationships with everything. These changes in history have often been very sudden, unpredictable and unforgiving. We are finding it very difficult to discard some preconceptions about the way things work - things that hark back to an earlier time when economies depended more on working with atoms than with electronic bits. Even in traditional native industries like agriculture here in Ireland, the information associated with a carcass of meat - describing its quality, its age, its market and its origin, all the paperwork to do with subsidy and so on - seems almost as important as the atoms that make up the meat itself.

“The market's conversion rate between atoms and bits has been running at about ten to one.”


Neil Gershenfeld's chapter on smart money, in 'When Things Start to Think', is a very interesting read; I recommend it. Development of computer technology in the 80s and 90s centred around creative manipulation of words, images, sounds and movies. But computing is becoming more pervasive, and people are using computing now while on the move. They are wearing several computers on their person, hardly noticing it. It is not just about multimedia. When you get a digital equivalent for something like money, it changes the way that money can be handled and how we think about it. I would argue that with cities you are faced with a similar challenge - to mix up your definition with ideas from the soft and the hard sides.

“Money now represents nothing more than beliefs”

“What the market thinks it is worth.”


Are we going to end up with a poor standard for digital money - something proprietary, thrown-together and fairly restrictive, like MIDI in the music industry - or will we have a more reliable open standard? Will we have people here in Ireland, for instance, working to create ways to manipulate digital money, and improve those ways through open collaborative working? The recent news about health care computing systems tells you just how important open collaborative working might be in the future. There you have something composed of both hardware and software - the buildings, beds, heating, lighting and environment of the hospitals, and the services provided by the doctors and nurses: the diagnosis, the care and the understanding. Then you have technology, which apparently isn't working well enough to allow all that to happen effectively.

“At the end of the gold standard, computing was done by relatively isolated mainframes. Computers were needed to record financial trends but did not create them.”


This imbalance is a recipe for disaster, because committing resources to a position is easy compared to evaluating the financial risk of a portfolio of derivatives, which requires a detailed understanding of the state of the markets and how they are likely to move. In this unequal competition, the trading side almost always wins out over the risk assessment side. Some firms effectively give up and ask the traders to evaluate themselves. Tellingly, Leeson at Barings and Iguchi at Daiwa both kept their own books, because their managers from an earlier generation did not feel capable of supervising them.

The solution is to recognise that the further that assets get divorced from underlying resources, the more necessary it becomes to merge spending with monitoring. Each new algorithm for valuing something must be mated to a new algorithm for assessing it. Back-end accounting then moves to the front lines, drawing on the same data feeds and mathematical models that are used for trading.
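The quoted idea - mating each valuation algorithm with an assessment algorithm that draws on the same model - can be sketched in a few lines. Everything here is invented for illustration; none of it comes from Gershenfeld's book.

```python
# Hedged sketch: the "trading" function and the "monitoring" function
# share one valuation model, so the risk check can never drift out of
# sync with how positions are priced. All names and figures are made up.

def value_position(price, quantity):
    """The trading side: what the position is worth right now."""
    return price * quantity

def assess_position(price, quantity, limit):
    """The monitoring side, mated to the same valuation model."""
    exposure = value_position(price, quantity)  # same data, same model
    return exposure <= limit

# A trade is only booked if its paired assessment passes.
ok = assess_position(price=102.5, quantity=1000, limit=200_000)
```

The point of the pairing is structural: adding a new valuation algorithm without a matching assessment function should be impossible by construction.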


So that, really, is why I think virtualisation, the back-end computing of Sun Microsystems, and this whole space is important for the future. But I guess the point is: is it going to be transparent enough that the best brains can gain access to it and improve its flaws? That's a lot to ask of society, because we tend to think of stuff like money and the creation of 'value' as some closed, private kind of thing. I don't know, because transportation is such a large physical concept in many people's minds. And money is another very 'large' concept in everyone's mind. I think this bias, to look at money, transportation, agriculture and health care as atom-based, is a hangover from the 19th century and the industrial revolution.


Brian O' Hanlon.

Re: Transportation Planning

Postby jimg » Fri Oct 07, 2005 2:02 am

Brian, your post on the other thread contains the following claim, which is actually an example of what I find irritating in the industry which provides me with a living:
The Java applets are the key. Here's why: for a program to run on a computer, it must first be translated from a language like Basic or C into the machine's native tongue. Because this translation process is incredibly time-consuming, most software comes already translated. But that means different versions have to be created for different computers. Java gets around this problem by using an intermediate language - a sort of Esperanto that is not machine specific but that can quickly be interpreted by any computer.

Reading this, it sounds like a very attractive idea. You've presented a metaphor involving mobility, an existing Tower of Babel, herculean tasks, etc. My difficulty with this is that the metaphor lives and is propagated because of this appeal, and has very little to do with the technical reality of the situation. As someone who works day-in and day-out with these technologies, I'm easily irritated by this. The reason I picked out the above is that I don't even have to argue that applets WILL fail; they have already been proven a failure - they died out completely years ago. So despite the appeal of the metaphor, it's the technical reality which eventually decided the fate of Java applets, and that fate was quite a quick death. Much more mundane things replaced them: animated GIFs, Javascript menus and dynamic HTML. None of these technologies have such appealing metaphors; none claimed to be revolutionary - they just entered the free market of technical solutions and slowly proved their value.

Because it's a fast-moving field, the list of once-hyped failures is huge. One part of my career involved extricating software systems from THE revolutionary idea of the late 80s and early 90s - CASE, or 4th generation languages. Never heard of it? I'm not surprised - the industry is fickle - but imagine this: a tool that writes software for you. You don't have to be a programmer; you describe, graphically, the structure of your organisation - painting the entities and the relationships between them; you "draw" the flow of information through it; you express the rules of your business in an English-like language; this all happens in a single easy-to-use graphical environment; at the end you press a button and it not only generates the programs for your desktop PCs, it generates the mainframe and server programs, creates a suitable database, generates all the glue to connect all the bits together, and configures everything too. These tools did exist, and nothing I've described above is factually incorrect, but what I've given you is the "story", to use marketing speak. It was all anyone could talk about for a few years, before the industry moved on to the internet and the web. Yet the very idea has a fundamental flaw which should be obvious if you think about it. The idea only works if you accept that pictures are a more powerful means of expression than text. This is appealing if you want to believe that simpler is better, but it is patently false; pictures are good for expressing simple ideas (for example to children and Neanderthal troglodytes), while the later development of linear, abstract, symbolic written language (i.e. text) has made possible the development of literature, logic, mathematics and sophisticated computer programming, and is arguably the basis of the awesome degree of human development over the last 10,000 years.

I can understand people's yearning for simple solutions to complicated problems (i.e. that you could just "paint" your organisation or desires and have a computer generate suitable software for you), or the desire to believe that, for example, there is a simple route to becoming wealthy (e.g. by joining a Ponzi/pyramid scheme), or that there is a simple answer to feeling somewhat unsatisfied with life (e.g. by participating in religious ceremonies). Yet I still believe that the belief in such answers is naive and the peddling of such ideas is insidious.

Re: Transportation Planning

Postby garethace » Sat Oct 08, 2005 9:17 pm

I am going to respond to the above shortly; I need a chance to sit down and read through it carefully. But while I am here, I will just outline a couple of additional bits about my interest in design and computers, to give you some background on where I might be coming from. In the past few months I have looked at the writing of Eric S. Raymond in The Cathedral and the Bazaar. I have looked at Richard Dawkins and his ideas in The Blind Watchmaker and The Selfish Gene. I have looked at what Frederick Brooks is talking about in The Mythical Man-Month, and lately I have also become very consumed with reading Edward de Bono's work on Lateral Thinking and Six Thinking Hats. From these references and some others, I am gradually beginning to develop some picture of what software engineers, system architects and project managers might be doing in their world. I just find it interesting to compare that with what architects and planners are doing in another sphere. Here is a good Edward de Bono quote.

Brian O' Hanlon.

A problem is simply the difference between what one has and what one wants. It may be a matter of avoiding something, of getting something, of getting rid of something, of getting to know what one wants.

There are three types of problem:

The first type of problem requires for its solution more information or better techniques for handling information.

The second type of problem requires no new information but a rearrangement of information already available: an insight restructuring.

The third type of problem is the problem of no problem. One is blocked by the adequacy of the present arrangement from moving to a much better one. There is no point at which one can focus one's efforts to reach the better arrangement because one is not even aware that there is a better arrangement. The problem is to realise that there is a problem - to realise that things can be improved and to define this realisation as a problem.

The first type of problem can be solved by vertical thinking. The second and third type of problem require lateral thinking for their solution.

Re: Transportation Planning

Postby garethace » Tue Oct 11, 2005 8:46 pm

you describe, graphically, the structure of your organisation - painting the entities and the relationships between them; you "draw" the flow of information through it; you express rules of your business in an English-like language;


Yeah, didn't Fred Brooks say something like: show me your flowcharts and hide your tables, and I will be even more confused; but show me your tables and hide your flowcharts, and everything will be clear. Eric Raymond underlined this in The Cathedral and the Bazaar by saying that a good design structure with dumb code is much better than the other way around. I remember Fred Brooks saying in his 'No Silver Bullet' essay - well worth tracking down, if you haven't already read it - that contemporary monitor screens just aren't up to the task of graphically representing a flow diagram for software. Something the size of a desk would be more suitable, and even then, a lot of engineers have been known to forsake the desk for the wider expanse of the floor.
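That tables-over-flowcharts point can be shown with a trivial, made-up example (the vehicle types and toll figures are invented): the table version puts the design in the data, where it is visible at a glance, while the branchy version buries the same design in control flow.

```python
# The flow-chart way: the design is buried in branching logic.
def toll_branchy(vehicle):
    if vehicle == "car":
        return 2.0
    elif vehicle == "bus":
        return 5.0
    elif vehicle == "truck":
        return 8.0
    return 0.0

# The table way: a "smart" data structure and dumb code.
# Adding a vehicle type means editing data, not logic.
TOLLS = {"car": 2.0, "bus": 5.0, "truck": 8.0}

def toll_tabular(vehicle):
    return TOLLS.get(vehicle, 0.0)
```

Both functions behave identically; the difference is that `TOLLS` can be read, reviewed, and extended without touching any code path.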

Had a good chance there to read your post and absorb it properly. I had a very similar conversation not long ago with someone else, about fuzzy logic compared with statistical methods of generating the exact same tools. If you call it statistics it sounds old hat, but at some conference the AI crowd claimed that some billion-dollar helicopter project in Asia had been saved from a mess just by applying 'fuzzy logic' to the problem. What my friend so clearly pointed out was that describing something old with cool new terminology was all that was going on. In your case, you are talking about something slightly different: cool new technologies, whose ideas are seductive, displacing otherwise good technologies whose ideas aren't nearly as sexy.

By far the most entertaining book I have read about software engineering is by Bill Blunden, called Cube Farm, about his experiences working on several software engineering projects in Minnesota back in the day. These days Bill mostly specialises in keeping legacy software applications going. Having had such an awful experience in Minnesota, he decided that 'new technology' was not for him. Well worth picking up a copy of Cube Farm if you need a good laugh.

Link I just thought you might like too:

http://www.aceshardware.com/forums/read_post.jsp?id=115080617&forumid=1

Re: Transportation Planning

Postby jimg » Tue Oct 11, 2005 10:07 pm

Brian, actually I'm probably overstating my case denigrating pictures as a means of expression but more to counter what seems to me to be prevailing idea that text is old hat. I'm a big fan of Brooks actually - he didn't get everything correct but his writings were years before their time.

And yes, the term "fuzzy logic" sets my teeth on edge. However, there's an interesting story, which isn't told much, on why the field of AI ended up with so many highly promoted but shallow ideas (like "fuzzy logic"). It was a result of western paranoia during the eighties. At the time the Japanese had destroyed the American car industry and consumer electronics industry, and were busy buying up American assets left and right. You don't hear about it these days, but sometime in the early/mid eighties the Japanese government announced a huge research project to create a "fifth generation computer". This beast was to be able to "think" like humans. You can imagine the consternation in the West - the Japanese were invincible and were now going to take over the global IT industry. Suddenly all major western governments started pouring money into AI research. This was particularly the case in the US. A lot of academics suddenly discovered that the work they were doing was really AI, and a lot of silly ideas were pounced on by government agencies and lavished with cash. Things didn't really turn out as expected, obviously - Japan is in relative (maybe terminal) decline, and all the aspects of Japanese organisation and society which seemed at the time to offer unique advantages over the West now look like serious handicaps.

Anyway, to try to steer things back on topic somewhat, and to think about using IT for transportation planning. I've been following some of Brian's messages and have read a couple of the books mentioned/recommended - in particular The Social Logic of Space and The Wisdom of Crowds. OK, they are quite unrelated books, but they've piqued an interest in the behaviour of human pedestrian crowds in response to their environment. One idea which struck me is whether anyone has seriously attempted to use computer simulation techniques to model this behaviour. Simulation is a technique where you provide very simple rules of behaviour for individual "actors" in a system and then programmatically fill an environment with instances of these actors. Often complex behaviour will emerge and become apparent even though the rules of the individuals are almost trivial. An example is the flocking behaviour of birds (and fish); this looks quite complicated, and the "reaction" of the flock to intruders, for example, looks as if the entire flock is operating under a single "intelligence". In actual fact such behaviour can be simulated easily by endowing each individual with extremely simple rules.
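A one-dimensional sketch of that flocking idea (Python; all the numbers are arbitrary): each actor follows a single trivial local rule - drift a little toward the group's average position, with some random jitter - yet the flock as a whole visibly contracts.

```python
import random

# Toy one-dimensional flocking sketch: one trivial rule per actor,
# coherent group behaviour emerges from nothing else.

def step(positions, cohesion=0.05, jitter=0.1):
    """Move every actor slightly toward the flock's centre, plus noise."""
    centre = sum(positions) / len(positions)
    return [p + cohesion * (centre - p) + random.uniform(-jitter, jitter)
            for p in positions]

random.seed(42)  # fixed seed so the run is repeatable
flock = [random.uniform(0, 100) for _ in range(20)]
start_spread = max(flock) - min(flock)

for _ in range(200):
    flock = step(flock)

end_spread = max(flock) - min(flock)
# After 200 steps the flock has contracted to a tight cluster:
# end_spread is a small fraction of start_spread.
```

No actor knows about "the flock"; each only nudges itself toward a local average, which is the whole point of the technique.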

Simulation is used extensively for traffic engineering, but traffic can be modelled easily with simple behaviour models for vehicles. Pedestrians obviously (to me anyway) have far more complex individual behaviours, so the problem would be far more difficult. However, I did some cursory googling and it doesn't seem that anyone has attempted this; the best I found was some fire safety research using simulation to model how long it would take people to escape a building. I'd be more interested in pedestrians in an urban environment - interacting with shops, roads, each other obviously, transport modes, parks, footpaths and other significant features - and whether such modelling could be used to predict the effects of altering the environment on how crowds behave. It's something I might try to explore if I get some time.
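The fire-escape style of model mentioned above can be sketched very simply: a grid of cells where each person greedily steps one cell toward an exit and waits when blocked. Everything here (grid size, the single exit at one corner, the greedy movement rule) is an invented toy for illustration, not any published evacuation model:

```python
import random

def escape_simulation(width=20, height=20, n_people=40,
                      exit_cell=(0, 0), max_steps=2000):
    """Return the number of steps until the room is empty.

    Each person takes one greedy step toward the exit per time step,
    waiting in place if the destination cell is occupied or already
    claimed this step. A person stepping onto the exit leaves the room.
    """
    cells = [(x, y) for x in range(width) for y in range(height)]
    people = set(random.sample(cells, n_people))
    for step in range(1, max_steps + 1):
        moved = set()
        # Process people in a fixed deterministic order each step.
        for (x, y) in sorted(people):
            if (x, y) == exit_cell:
                continue  # already standing on the exit: leave
            target = (x - (x > exit_cell[0]) + (x < exit_cell[0]),
                      y - (y > exit_cell[1]) + (y < exit_cell[1]))
            if target == exit_cell:
                continue  # steps onto the exit and leaves
            if target in moved or target in people:
                moved.add((x, y))  # blocked: wait where you are
            else:
                moved.add(target)
        people = moved
        if not people:
            return step
    return None  # did not evacuate within max_steps
```

Even this crude rule set produces queueing at the exit, and you could compare evacuation times for different room layouts by varying the parameters. Real pedestrian models add collision avoidance, walking speeds and route choice on top of something like this.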
jimg
Member
 
Posts: 480
Joined: Mon Nov 22, 2004 9:07 pm
Location: Zürich

Re: Transportation Planning

Postby garethace » Tue Oct 11, 2005 10:30 pm

Brian, actually I'm probably overstating my case in denigrating pictures as a means of expression, but more to counter what seems to me to be the prevailing idea that text is old hat.


If you care for a very interesting exploration of just that concept, check out the first chapter in Howard Rheingold's 'Smart Mobs' where he talks about the thumb tribes over in Japan.

(Mobile phone texting, and mobile internet services)

Funny how Fred Brooks found himself at the University of North Carolina, trying to do exactly the opposite of what you described in relation to AI. I have a quote by Brooks somewhere, describing his aims in the VR research labs back in the 60s and 70s. Just give me a sec, I will fish for it.

Brian O' Hanlon.
garethace
 
Posts: 1579
Joined: Wed May 14, 2003 9:01 pm
Location: Dublin, Ireland

Re: Transportation Planning

Postby garethace » Tue Oct 11, 2005 10:37 pm

Here is just a small quote from Howard Rheingold's book, Virtual Reality. Brooks explained that intelligence amplification had much more scope for development and use than the work happening on AI at other universities.

Brooks sees three areas in which human minds are more powerful than any computer algorithms yet designed. "The first of these is pattern recognition, whether visual or aural," he said. "Computer scientists don't even have good ways of approximating the pattern recognition power a one-week-old baby uses to recognise its mother's face from an angle and with a lighting it has never seen before." In which case, Brooks believes, it is possible to multiply that power by using the computer to show humans patterns in ways they are not normally able to perceive, and let the human side of the system decide which ones are meaningful.

The second major area of human computational superiority is the realm of what Brooks calls evaluations: "Every time you go to the supermarket, you're performing the kind of evaluations that the computer algorithms we have today can only roughly approximate." The third area of human mental superiority is in "the overall sense of context that enables us to recall, at the appropriate moment, something that was read in an obscure journal twenty years previously, in reference to a completely different subject, that we suddenly see to be meaningful."

According to Brooks, the three areas in which computers are more skilled than human minds are "evaluations of computations, storing massive amounts of data, and remembering things without forgetting." I asked Brooks how he thought a human-computer cooperative system ought to be built, and he replied: "I think in an ideal system for tackling a very hard problem, the machine does the calculation and remembering and searching of data bases - and by calculation I mean the evaluation of some very complicated functions - while the human being does the strategy, evaluation, pattern recognition, planning, and fetching information in context."

When you start to define the interface for such a system, you bring yourself to the threshold of VR. Brooks presents it as an ineluctable logic, the same logic that computer graphics pioneer Ivan Sutherland followed in 1965 when he made the first head-mounted display and mapped out the agenda for all those in the future who would seek ways to put the user inside a computer-created world, instead of peering in at it through a narrow window.


Now if you really want to see places where Fred's ideas are used today, just look at Tim Hubbard's ideas for a public domain project to analyse, organise and order the data for the human genome. They are using an online web service to try and in some way annotate and 'develop' that data into something usable. This kinda brings one back to a study of Ivan Sutherland, Frederick Brooks, and many other early VR pioneers.

The pioneers like Brooks in VR were just concerned about guiding the early steps of a new technology in the right direction - something like Neil Gershenfeld and personal fabrication nowadays. Having left IBM, where he had led the development of the operating system for the System/360 mainframe, Brooks managed to get a graphics system from IBM to get started at the University of North Carolina with early VR systems. This was back in the late 1960s and early 1970s.

Brian O' Hanlon.

Re: Transportation Planning

Postby garethace » Tue Oct 11, 2005 10:46 pm

An example is the flocking behaviour of birds (and fish); this looks quite complicated, and the "reaction" of the flock to intruders, for example, looks as if the entire flock is operating under a single "intelligence".


Physics Engines, yeah, I know, another new gizmo!

Brian O' Hanlon.

http://www.aceshardware.com/forums/read_post.jsp?id=115142534&forumid=1

http://www.aceshardware.com/forums/read_post.jsp?id=115142543&forumid=1

Re: Transportation Planning

Postby garethace » Thu Oct 13, 2005 10:10 pm

Even in traditional native industries like agriculture here in Ireland, the information associated with a carcass of meat - describing its quality, its age, its market and its origin, all the paperwork to do with subsidy and so on - seems almost as important as the atoms that make up the meat itself.


Sorry, couldn't resist making the link to my own earlier post; it seems like the knowledge economy is about to get bar-coded just like carcasses of meat are now.

http://www.theinquirer.net/?article=26860

Brian O' Hanlon.

Re: Transportation Planning

Postby garethace » Fri Oct 21, 2005 9:08 pm


Re: Transportation Planning

Postby garethace » Fri Oct 21, 2005 9:15 pm

Other very interesting articles in the back issues page:

http://www.masstransitmag.com/script/metasearchstart.shtm

Brian O' Hanlon.

Re: Transportation Planning

Postby garethace » Tue Dec 27, 2005 7:10 pm

Simulation is used extensively for traffic engineering, but traffic can be modelled easily with simple behaviour models for vehicles. Pedestrians obviously (to me anyway) have far more complex individual behaviours, so the problem would be far more difficult. However, I did some cursory googling and it doesn't seem that anyone has attempted this; the best I found was some fire safety research using simulation to model how long it would take people to escape a building. I'd be more interested in pedestrians in an urban environment - interacting with shops, roads, each other obviously, transport modes, parks, footpaths and other significant features - and whether such modelling could be used to predict the effects of altering the environment on how crowds behave. It's something I might try to explore if I get some time.


While cleaning up a hard drive today, I came across some old research I did on the Net about this a couple of years back. You may or may not find something useful there. The site and organisation are written down below.

Brian O' Hanlon.

The paper I read was:

Agent-Based Pedestrian Modelling, Paper No. 61,
by Michael Batty.

http://www.casa.ucl.ac.uk

http://www.casa.ucl.ac.uk/working_papers/

CASA,
Centre for Advanced Spatial Analysis,
University College London,
1-19 Torrington Place
Gower Street
London WC1E 6BT

Re: Transportation Planning

Postby garethace » Thu Jan 26, 2006 9:13 pm

Simulation is a technique where you provide very simple rules of behaviour for individual "actors" in a system and then programmatically fill an environment with instances of these actors. Often complex behaviour will emerge and become apparent even though the rules of the individuals are almost trivial. An example is the flocking behaviour of birds (and fish); this looks quite complicated, and the "reaction" of the flock to intruders, for example, looks as if the entire flock is operating under a single "intelligence". In actual fact such behaviour can be simulated easily by endowing each individual with extremely simple rules.


On that, for a real laugh, check out this early 2003 scribbling of mine here:

Virtually Simulated Battle for Middle Earth.

http://www.aceshardware.com/forums/read_post.jsp?id=80066252&forumid=1

Yeah, LOTR was all the rage back then.

Brian O' Hanlon.

Re: Transportation Planning

Postby garethace » Mon May 08, 2006 11:03 pm

Attachments
Example_1_1.jpg
Example_1_1.jpg (201.03 KiB)


Return to Irish Planning Matters