
22 December 2014

Mechanisms of trust building

It's eye-catching to be told that the 21st C organisation will be small, decentralised and flat: the very opposite of the 20th C organisations that still cling on to power. The article points to the software technology behind Bitcoin as having much greater potential applications. The individualising guarantee of uniqueness in the digital realm on which Bitcoin is built is something, it is argued, that can in a sense 'automate' trust by enabling people to have reasonable grounds to believe that other parties to agreements will play the part they agree to. This leads to the following scenario:
So the traditional corporate structure where investors, a board and a senior management team make the big decisions for a company might be challenged by an arrangement where groups of self-employed individuals with complementary skills and experience contract with each other to pursue a certain commercial project. The co-ordination, decision-making and operational matters usually handled by the corporate hierarchy would be managed by a combination of computer code and a diversity of individuals and organisations in return for material incentives such as an automatic share of profits. The trust required to ensure that all the contracted parties had the necessary skills and resources to fulfil their functions would be built into the very code and processes that facilitate the contracts just as conducting a transaction in bitcoin inherently provides the necessary guarantee of trustworthy payment.
What this helps us to see is that a good deal of what many larger corporations do is actually about enabling 'trust' and processing background information which is not core to the aims of the enterprise. It's probably the thing that triggers the common cry against bureaucracy: the sense that it is not very directly contributing to the core endeavours, merely supporting and even sometimes seeming to get in the way of 'actually doing'. It seems that this software solution could be the death knell for at least some administrators and middle management.
Precisely the same principles could, in fact, be applied to any area of common endeavour, removing the need, for example, for hobby clubs to have an organising committee, political campaigns to have a central leadership, public services to have government-appointed managers or even for social networks and search engines to have an office full of co-ordinators.
When we consider that one of the aims of a bureaucracy is to deal with processes fairly and even-handedly, in fact without respect for persons, then this kind of solution is the ultimate in depersonalising administrative processes.
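To make the 'trust built into code' idea a little more concrete, here's a minimal sketch in Python. It is purely my own illustration, and nothing like the full Bitcoin protocol: it only shows how a shared record can police its own integrity, with each entry carrying a cryptographic hash that chains it to the previous one, so any party can detect tampering without an administrator vouching for the record. The parties and agreements named are invented for the example.

```python
# A toy hash-chained ledger: purely illustrative, NOT the Bitcoin protocol.
# Each entry is bound to its predecessor by a SHA-256 hash, so any later
# alteration of an agreement is detectable by every party without a
# central administrator vouching for the record.
import hashlib
import json


def entry_hash(record: str, prev_hash: str) -> str:
    """Hash an entry's contents together with the hash of its predecessor."""
    payload = json.dumps({"record": record, "prev_hash": prev_hash},
                         sort_keys=True).encode("utf-8")
    return hashlib.sha256(payload).hexdigest()


def add_entry(ledger: list, record: str) -> None:
    """Append a record, chaining it to the hash of the previous entry."""
    prev_hash = ledger[-1]["hash"] if ledger else "genesis"
    ledger.append({
        "record": record,
        "prev_hash": prev_hash,
        "hash": entry_hash(record, prev_hash),
    })


def verify(ledger: list) -> bool:
    """Re-check the whole chain; a single altered record breaks it."""
    prev_hash = "genesis"
    for entry in ledger:
        if entry["prev_hash"] != prev_hash:
            return False
        if entry["hash"] != entry_hash(entry["record"], prev_hash):
            return False
        prev_hash = entry["hash"]
    return True


# Hypothetical agreements between two contracting parties.
ledger = []
add_entry(ledger, "Alice agrees to deliver the design work by Friday")
add_entry(ledger, "Bob agrees to pay Alice 10% of the project profits")
print(verify(ledger))   # True: the shared record is internally consistent

# If anyone quietly rewrites an earlier agreement, verification fails.
ledger[0]["record"] = "Alice agrees to deliver the design work by Monday"
print(verify(ledger))   # False: the tampering is visible to everyone
```

Real systems add digital signatures and distributed consensus on top of this chaining idea; the sketch is only meant to show why the guarantee can live in the code rather than in an office of administrators.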

Of course the question is whether the programming can be up to it. I would guess that quite a lot of trial and error may need to be endured at first until blocks of programming which generally work for common kinds of joint enterprise are devised and can be called up and finessed relatively easily.

What will this do to the character of corporations? Will it mean more, smaller, leaner and shorter-lived enterprises? Will the ties so created be enough to form corporations, or will they be less than such?

At the moment I can't work out the answers, but I'll be watching out for developments.

See also the article at the RSA.

10 January 2013

Spirituality of the ordinary in these days of miracle and wonder

As a piece of cultural observation, I think that this is worth paying attention to because it seems to me to identify something that could well become a big issue in the kind of way that 'instantness' has. The basic idea is that the proliferation of cameras and connectedness combined with the human attraction to novelty, spectacle and emotional extremes is likely to produce a stream of the extraordinary ...
Cameras are becoming ubiquitous, so as our collective recorded life expands, we'll accumulate thousands of videos showing people being struck by lightning. When we all wear tiny cameras all the time, then the most improbable accident, the most superlative achievement, the most extreme actions of anyone alive will be recorded and shared around the world in real time. Soon only the most extraordinary moments of our 6 billion citizens will fill our streams. So henceforth rather than be surrounded by ordinariness we'll float in extraordinariness.
So what will it feel like to be 'us', ordinary people in humdrum lives, when
the improbable dominates the archive to the point that it seems as if the library contains ONLY the impossible, then these improbabilities don't feel as improbable.
So plausibility is affected. The miraculous is more imaginable, the 'laws' of physics perhaps seem more likely to be 'broken': "To the uninformed, the increased prevalence of improbable events will make it easier to believe in impossible things."

So will that contribute to the growth of all sorts of belief and superstition? And how should we respond as Christians? I'd suggest we might find ourselves in the place of wanting to defend the regular, predictable and scientific, holding the fort in the face of those who may believe anything, as GK Chesterton warned.

Theologically, I would suggest that the Incarnation, notwithstanding its own extraordinariness, paradoxically affirms ordinariness. The extraordinary actually plays into the very ordinary and humdrum. It is precisely a choice to align with the 99% and the unremarkable and to eschew the spectacular. And even later, when the miraculous does figure, it's understated and woven into the everyday, with Jesus on the whole trying to keep it out of the news because hype distorts and misdirects. The extraordinary tends to discriminate against most of real life. We have an arrabon of that in 'celebrity culture'.

What I think this may mean is that we need to help people develop an affirmative spirituality of the commonplace and everyday. This would be a mindfulness of the humdrum. More than that, I think it means learning to enjoy, take pleasure in and be thankful for the regular 'stuff' which supports everyday life. In addition we will need to learn the discipline of curtailing envy of the extraordinariness of others, to be content and to forgo invidious comparison.

And we need to begin now to help people to grow in these dimensions of spirituality.

The Technium: The Improbable is the New Normal:

25 July 2012

Self-refuting communication: medium versus message

I just got a phone call.
A recorded voice began: "This is an important message ..."
I put the phone down muttering, "If it's so important, why is it a recorded message?"

In an age of cheap reproduction of all kinds of messages, importance is marked by personal attention, time given and demonstrable consideration of one's personal circumstances and background. A message that is, in effect, broadcast just doesn't do that.

Of course, the age of googlised personalisation, where targeted but automated messages are more and more accurately tailored to personal history, may change that somewhat. But probably in a way that makes a real person more valuable in conveying the message "this is/you are important". Consider the value of a handwritten letter against a printed one against a leaflet.

25 October 2010

Secure Password Solution

I'm looking at using this: I have a number of passwords and sometimes I'm not sure which goes with what, and so it may be that this article, The Easy, Any-Browser, Any-OS Password Solution, has a good answer. Here's why: "Of all the password management utilities out there, I consider LastPass the most elegant compromise between convenience and security, and if you're not using it already, I recommend you start. It's mostly free, plugs into nearly any browser or smartphone, is KeePass compatible, and just works."
So, I'm checking it out. If you do too, let me know what you think.

13 October 2010

Chicken or egg with screen time and problems?

What I can't see in this article, Screen time linked to psychological problems in children, is a clear sign that the alternative hypothesis has been considered: "The results showed that more than two hours per day of both television viewing and recreational computer use were related to higher psychological difficulty scores,"
This led to the statement that restricting children's screen time might be good for them. But what if the relationship between screen time and psychological disturbance is not cause and effect but effect and cause? Or even both signs of some other (unnoticed or unconsidered) factor that affects both of those things?
Admittedly, I'm being a bit skeptical of things that look like they could feed the 'woe woe and thrice woe' attitude to ICTs, so maybe the interpretation is right. But even if it is, then the cause needs to be known and possibly the question asked whether there are carry-overs or implications for people who have to work with ICTs for 8 or more hours a day ... ? So, my verdict: more research needed ...

PS: I just found a comment on this research here at Neuroanthropology, a way down the page under 'Digital': "My guess is that screen time can be a proxy for other things going on in a home, such as parental neglect, and so the real cause of the later psychological problems is not elucidated by this study."

10 October 2010

Commonplaces of technology critique

This is a really nice round-up of what I've tended to call the 'moral panic' approach to new technology, especially the internet and all its works: Eurozine - Commonplaces of technology critique - Kathrin Passig. It identifies, with some examples lending plausibility, several predictable phases of reaction to new-fangled jeejaws and thingummibobs: it starts with downright dismissal, moves through "It'll never catch on", then it'll be dismissed as something that "people like us" wouldn't be seen dead with and "It's just a fad"; then, once it becomes obvious that it's settling in, we get onto the "end of civilisation" jeremiads (which is where we're at with the internet just now: using the internet rots minds)... And so on ...
I loved this (one of many examples of the persistence and antiquity of the cycle of responses) in this case in the early stages of 'negotiation' about the etiquettes of new technologies:
In the early days of the printing press it was seen as bad manners to give a printed book as a gift; until the 1980s there was a stigma of rudeness attached to typed private letters. The criticism of the use of mobile phones in public deems a conversation with an invisible partner – as opposed to one with a third party who is physically present – to be an unacceptable lack of respect for the people in one's vicinity. Sitting in cafés with one's laptop open is something that restaurateurs do not like to see – it gives an antisocial impression and reduces takings – yet sitting around in public with a book or an open newspaper has not caused any offence for quite some time. The unspoken thrust of these complaints is ultimately that opponents of an innovation do not want to be confronted with it without their consent.
Oh, and do have a look at this to get a further sense of perspective:
"For critics around 1870, the postcard sounded the death-knell for the culture of letter-writing, while in February 1897 the American Newspaper Publishers Association discussed whether "typewriters lower the literary grade of work done by reporters"."
This is a must-read article to put beside all those books that seem to be coming out to alarm us about the brain-rotting, civilisation-threatening effects of search-engines and blog-reading.

07 September 2010

Kindle -ready for action?

I've been keeping an eye on this e-reader malarkey, not least because I spend about 7 hours a week on trains and don't want to carry more weight in books than I have to; this kind of device would seem to be a potential help in my on-train reading-to-weight ratio! I am also keen on the e-ink idea because back-lit screens don't do well in very well-lit positions, and sometimes a train seat by the window is exactly that! I'm also of an age when holding a book's weight for extended periods can give rise to concerns about carpal tunnel problems and RSI, so a lighter device with no concerns about positioning oneself to hold pages open is a definite plus.

However, I have other concerns, ones I think are probably shared by those involved in scholarly pursuits. I need a device that allows me to annotate the text, highlight stuff for later retrieval and in general enable me to re-cycle stuff into further articles and arguments. It seems that the Kindle would enable this to happen: "using the QWERTY keyboard, you can add annotations to text, just like you might write in the margins of a book. And because it is digital, you can edit, delete, and export your notes. You can highlight and clip key passages and bookmark pages for future use."

What I don't know, though, is how many formats of e-books there are and how versatile the Kindle might be with regard to them, and whether there are other devices that would get a greater thumbs up ... Anyone got any suggestions or any more thoughts?
Kindle Wireless Reading Device, Wi-Fi, 6" Display, Graphite - Latest Generation: Amazon.co.uk: Kindle Store

14 December 2009

What the web is teaching our brains

It's nice to find an article on this topic that isn't doing the moral panic thing (hand-wringing and exclaiming 'modern life is going to the dogs ... or to hell in a handbasket'). This one actually mentions the positive enhancements that our continued interaction with the technology might be producing. What's quite interesting is that one of the downsides noted, memory loss, was actually said about the technology called the book (and repeated when the book became a mass-produced commodity with moveable-type printing); it's a feature of delegating remembering to other media.
What the web is teaching our brains - Features, Health & Families - The Independent: "While the internet enhances our brain function in some ways – his study found it boosted decision-making and complex reasoning in older people – it can also lead to memory loss. Some research suggests there may be links between excessive computer use and conditions such as attention deficit disorder, depression and anxiety in younger people."

The article goes on to look at different skill-areas and break down their skill-sets. I'd agree about gaming and peripheral vision: I have great trouble taking in all of the info on a screen of gameplay: I just don't practice enough (and have no intentions of doing so at the moment).

11 November 2009

The internet is killing storytelling | Ben Macintyre - Times Online

I think that Ben Macintyre in this article, The internet is killing storytelling | Ben Macintyre - Times Online, is drawing on an article from last year:
In a remarkable recent essay in the Atlantic Monthly Nicholas Carr admitted that he can no longer immerse himself in substantial books and longer articles in the way he once did. “What the net seems to be doing is chipping away at my capacity for concentration and contemplation,” he wrote. “My mind now expects to take in information the way the net distributes it: in a swift-moving stream of particles.”
If the culprit is obvious, so is the primary victim of this radically reduced attention span: the narrative, the long-form story, the tale. Like some endangered species, the story now needs defending from the threat of extinction in a radically changed and inhospitable digital environment
I think it may have the hallmarks of a 'moral panic' article. (However it's probably quite a good teaching resource ...)

There are various holes in the argument. Like the fact that much of what is happening on blogs, Twitter etc is storytelling: people are narrating their own lives and those of others around them. The point about long stories is less telling when we realise that the modern novel is, well, modern: I've just been commending the Confession of St Patrick to some students, pointing out that it doesn't take long to read; people didn't write 'War and Peace'-sized tomes before printing and the turn to the introspective conscience. At least the author recognises this in the last paragraph. It's really attention span he's worried about, but I think the jury's still out on that one: we need to work out what the new technologies are doing to our sensoria in dialogue with culture, and it's too early to tell for sure; my guess is that, ADD aside, we have the same attention capabilities, we just use them differently, and there will be upsides and downsides to that.

In fact, it's worth looking at an article by Jamais Cascio, published about a year after Carr's, which responds to the concerns in much the same way, only in more depth, and names what may be the emerging change to our mental reflexes: fluid intelligence. Like me, he's concerned that it's too early to tell for sure. However, he goes on to make a few tentative explorations of the kinds of effects mind-enhancing drugs and technologies could have; this is important territory and, given the speed of change, not too early for some of us to be developing perspectives to be able to assess the matters as they present, without the moral panic reaction of 'new/different = bad'. He ends with this intriguing couple of paras.

The bad news is that these divergent paths may exacerbate cultural divides created by already divergent languages and beliefs. National rivalries often emphasize cultural differences, but for now we’re all still standard human beings. What happens when different groups quite literally think in very, very different ways?

The good news, though, is that this diversity of thought can also be a strength. Coping with the various world-historical dangers we face will require the greatest possible insight, creativity, and innovation. Our ability to build the future that we want—not just a future we can survive—depends on our capacity to understand the complex relationships of the world’s systems, to take advantage of the diversity of knowledge and experience our civilization embodies, and to fully appreciate the implications of our choices. Such an ability is increasingly within our grasp. The Nöocene awaits.

Hmmmm. Noocene sounds a bit like there's an influence from Teilhard de Chardin: noosphere ...

10 September 2009

Review: ID: The Quest for Identity in the 21st Century:

I got this because I'm interested in the issue of identity both from an anthropological point of view and also a political one (ID cards etc). I also have a long-standing interest in neuroscience as an amateur onlooker, and Susan Greenfield is a populariser of neuroscience, so just the kind of author I'm likely to find useful.

This book seeks to address the interfaces between IT, biotech and nanotech with a view to exploring what these things, singly or together, may do to human identity. To do this, there is a rather intriguing big-picture approach to different forms of identity in different socio-cultural milieux.

I found the Anyone, Someone, Nobody and Eureka typology interesting and worth further reflection. 'Anyone' is about collective identities born of an extreme ideology where the individual is subsumed in the collective; 'Someone' is about the identity formation of extreme individualism; 'Nobody' is the danger of 'descent' into mere sensoriness and effectively losing or not activating the higher thinking capacities; 'Eureka' is a creative identity which seems to synthesise the best of all and avoid the worst of each.

There are some interesting discussions of belief and a use of the seven deadly sins to think about the dangers of social identities. An interesting idea but needing more development and care to convince those of us who 'do religion' on a regular basis.
ID: The Quest for Identity in the 21st Century: Amazon.co.uk: Susan Greenfield: Books

10 April 2009

Batteries grown from 'armour-plated' viruses

Now, I've grown a little skeptical about some breakthroughs that seem to promise a huge fix for environmentally-related technologies; but this actually does seem to be pretty promising. Batteries grown from 'armour-plated' viruses - tech - 08 April 2009 - New Scientist: "Compared to conventional lithium ion batteries, the biologically grown battery is environmentally friendly because much of the materials can now be made at room temperature or on ice and without harsh solvents." The fact that it's cheaper, easier and needs less engineering seems to be a winner.

25 February 2009

Flex-E-books soon ...

Now this looks like what I've been waiting for. I recall seeing this firm's previous discoveries reported a few short years back, and it finally looks like they are about to produce the goodies. New Scientist records it here: Flexible electronic books to hit market soon - tech - 23 February 2009 - New Scientist, and the nub seems to be this: "Plastic Logic says it has now perfected a way of printing polymer transistors onto a layer of bendy plastic - allowing the screens to flex and bounce. 'Screen breakage is the number one complaint with today's e-reader technology. Our display can take a lot of rough and tumble,'"
A bit later on we're told: "The company says it is now ramping up to commercial production of the screens, which will be just under A4 in size. "It'll be a much better e-reading experience at this magazine size - keeping layouts and graphics intact without converting them to small and unattractive formats," Eschbach claims. The device will have wireless internet connection and a touch screen, allowing use of a virtual keyboard for annotating text."
That sounds more like it. I have no idea of cost, and I guess the first generation is going to be beyond my wallet, but it would seem to be what I've been waiting for: glad I haven't bought a Kindle ...

Now combine that technology with this one and you could have something very, very powerful: I can see classroom uses which might make smart board technology as we currently know it obsolete: "a device that lets you do something similar. Called sixth sense, it offers a way of displaying information on any surface - a newspaper or wall, for example - and manipulating it with hand gestures. "We wanted to find out if we could merge information into a sense that is always with us," says Maes's student Pranav Mistry.
The device, which is worn around the neck as a pendant, contains a small projector and webcam. The pendant communicates either with a laptop in a backpack or a smartphone connected to a remote server. The webcam monitors the user's hand gestures and conveys them to control software running on the laptop or server."

And for musos ...

01 February 2009

The self in a wired world

I was attracted to this because it seemed to be dealing with the interaction between culture, technology and the self.
This is what the contemporary self wants. It wants to be recognized, wants to be connected: It wants to be visible. If not to the millions, on Survivor or Oprah, then to the hundreds, on Twitter or Facebook. This is the quality that validates us, this is how we become real to ourselves — by being seen by others. The great contemporary terror is anonymity. If Lionel Trilling was right, if the property that grounded the self, in Romanticism, was sincerity, and in modernism it was authenticity, then in postmodernism it is visibility.
What I'm cautious, if not skeptical, about is making claims that the consciousness so raised is fundamentally different from that of previous eras. Now, I'm conscious that recent brain research seems to indicate that people do tend to gravitate towards different perspectives in different languages, which suggests that different cultures would produce somewhat different habitus, ceteris paribus. However, we should be wary of overstating the differences, and at times I think that this article does get close to that. I think that as long as it stays on the side of saying that what differs is how loneliness is experienced in relation to other culturally-related phenomena, it holds together; but at times it seems to strain against that, as if wanting to say that those acculturated by modern communications technologies are somehow fundamentally different, as if loneliness didn't exist before. What is fair enough is to say that loneliness as a fundamental human experience means differently in different societies. I'm definitely wary of the "ain't modern life awful" tendency in cultural comment: it tends to forget the problems and downsides of the previous eras in a rose-tinted romance of earlier times.

A little later we are told that technology takes away our solitude. I'm definitely not convinced. I think that it's always been the case that solitude for most people has to be sought and cultivated. Technology just means that we need a different set of techniques to gain it. But in societies where families were typically bigger and shared space smaller, solitude was not something people found readily. To find it then meant a trip to a park or the country or a church or the toilet (working class men took the newspaper), to find it now means turning off the phone.

There is a good point made here:
The goal now, it seems, is simply to become known, to turn oneself into a sort of miniature celebrity. How many friends do I have on Facebook? How many people are reading my blog? How many Google hits does my name generate? Visibility secures our self-esteem
This does offer a useful observation which has a great deal of truth to it. But wait. Is this really so different to how things have been for people before the internet; did people really not try for fame, notoriety, 'respect'? Of course not: the internet has merely offered more opportunities and a potentially bigger audience.
But we no longer believe in the solitary mind.
And a good job too: it was a myth of modernist individualism, the illusion created and nurtured by the technology called the book. It discouraged collective and corporate consciousness but didn't displace it entirely, because we have to be social: we use language, we live through culture. We are in constant dialogue between individuation and socialisation; different cultures hold different ideals as to the proper kind of balance between the two; but we cannot escape that dialogue. Our sins are usually either to fail to take responsibility and to be individual, or to be over-self-assertive. The trick is to learn which is which and when which is appropriate. I'm not sure that this article, in the end, does much more than remind us that there are particular challenges to solitude in our culture. And this is true; it gives us a few insights into why they might be difficulties for our generations. However, I'm disturbed that it gives the impression every so often that this is unique and that our generation is in a worse state than any previous one. I question those impressions.

14 January 2009

Questions to ask about a technology

These questions, based on the work of Jacques Ellul, are definitely worth reflecting on as part of our work of cultural reflection. The headings are: ecological; social; practical; moral; ethical; vocational; metaphysical; political; and aesthetic.
Here are some of the questions from the section headed 'ecological'.
Letters from a Skeptic by Gregory A. Boyd: "What are its effects on the health of the planet and of the person?
Does it preserve or destroy biodiversity?
Does it preserve or reduce ecosystem integrity?
What are its effects on the land?"

02 December 2008

Ten technologies to save the climate

Could you name 10? I'm not sure I could have, but never fear, New Scientist to the rescue. Here are four that I reckon it'd be easy to miss from your list.
7. Second-generation biofuels
Making fuel from food crops is now almost universally regarded as a bad idea, encouraging deforestation and potentially leading to food shortages. But the next generation of biofuels made from agricultural waste shows real promise. Using new cellulose-cracking technologies, waste wood can be broken down into liquid fuel, and with US venture capitalists investing heavily in these technologies, it won't be long until this idea becomes a reality. However, with the global appetite for fuel on the increase, careful management of cellulose production will be vital.
8. Carbon capture
With the growth of renewable energy sources failing to keep up with world demand for electricity, finding an effective way of capturing and storing the carbon dioxide produced by power stations is one of the most important challenges we face. Investment in carbon-capture technologies has been slow to pick up, but governments around the world are starting to understand the importance of funding this research, and promising new technologies are already emerging.
9. Biochar
With predictions of climate change getting increasingly urgent, we desperately need cheap, simple and fast ways of reducing greenhouse emissions. One idea is to sequester carbon as biochar, a charcoal made from burning agricultural waste in the absence of air. Biochar is exceptionally stable and can be stored underground for hundreds of years without releasing its carbon into the atmosphere - and it improves the fertility of the soil.
10. Biogas stoves
Deforestation is a complex issue, and it's looking more and more likely that we will have to pay people to maintain forest lands. But until such a system is up and running, we will need to focus on technologies that reduce the need to cut down trees. One such technology is biogas stoves, powered by methane released from rotting organic waste, which would otherwise be released into the atmosphere. Leading the way is China, which is heavily promoting the use of biogas technologies.

14 November 2008

Is Google Making Us Stupid?

The premise is reasonable and, I think, basically true: "Thanks to the ubiquity of text on the Internet, not to mention the popularity of text-messaging on cell phones, we may well be reading more today than we did in the 1970s or 1980s, when television was our medium of choice. But it’s a different kind of reading, and behind it lies a different kind of thinking—perhaps even a new sense of the self." This draws on the insight of Marshall McLuhan that the tools we use extend our bodies and so our minds, our ways of thinking. Our tools enable some things and disable others. The trick for us in analysing culture is to learn to pay attention to both.
Now this article tends, imo, towards the glass-is-half-empty thing. While it does a nice little job of pointing out a few ways that artefacts have probably altered our thinking (and therefore the clustering of neural connections in our brains; thus our brains are literally shaped by our technologies), we aren't as fully reminded that the jeremiads about new technologies have been a part of their births and infancies for a long time too: printing was maligned as the end of rigorous thinking because, and this was predicted rightly, it meant that no longer would people remember huge passages (and in so doing usually engage in depth with the arguments of the authors).
So we have this paragraph towards the end: "The idea that our minds should operate as high-speed data-processing machines is not only built into the workings of the Internet, it is the network’s reigning business model as well. The faster we surf across the Web—the more links we click and pages we view—the more opportunities Google and other companies gain to collect information about us and to feed us advertisements. Most of the proprietors of the commercial Internet have a financial stake in collecting the crumbs of data we leave behind as we flit from link to link—the more crumbs, the better. The last thing these companies want is to encourage leisurely reading or slow, concentrated thought. It’s in their economic interest to drive us to distraction."
Admittedly, the author owns that this is a downside-view, but I think we should also be looking for the upside-view that will enable us to see the equivalents of the many positives that writing and printing have brought to us.

You see, I don't see the widening of information vistas quite so negatively. I think that I can put this in terms suggested by the Myers-Briggs Type Indicator. One of the continua for human personality it works with is that between iNtuitive approaches to taking in data and Sensing: the former is about big pictures, the latter about detail. What I find happening is that I do indeed, as the experience of those mentioned in this article suggests, tend to skim more than I used to. But this suits me: I get wider pictures of things and then move into the details as I get a sense of what the big issues are. I'm wondering whether what is changing is that scholarship etc is moving from the age of the bottom-up synthesiser to that of the big-picture, algorithmic approach: from the Sensers to the iNtuitives, perhaps even from the Introverts to the Extroverts as well, as connectedness is more fully enabled and valued.

Are the nay-sayings merely the bleats of the formerly artefact-advantaged as new artefacts help a different set of human traits to come to the fore?

Is Google Making Us Stupid? - The Atlantic (July/August 2008)
HT El Pais and The Edge

14 May 2008

Gin, Television, and Social Surplus

Now this is one of those articles that actually seems to have put its finger on something quite important. It starts with the observation about the industrial revolution and gin consumption thus: "The transformation from rural to urban life was so sudden, and so wrenching, that the only thing society could do to manage was to drink itself into a stupor for a generation. The stories from that era are amazing-- there were gin pushcarts working their way through the streets of London. And it wasn't until society woke up from that collective bender that we actually started to get the institutional structures that we associate with the industrial revolution today. Things like public libraries and museums, increasingly broad education for children, elected leaders--a lot of things we like--didn't happen until having all of those people together stopped seeming like a crisis and started seeming like an asset."
Then it moves to a really intriguing speculation about the 20th century equivalent.
Starting with the Second World War a whole series of things happened--rising GDP per capita, rising educational attainment, rising life expectancy and, critically, a rising number of people who were working five-day work weeks. For the first time, society forced onto an enormous number of its citizens the requirement to manage something they had never had to manage before--free time. And what did we do with that free time? Well, mostly we spent it watching TV.

So what?
And it's only now, as we're waking up from that collective bender, that we're starting to see the cognitive surplus as an asset rather than as a crisis. We're seeing things being designed to take advantage of that surplus, to deploy it in ways more engaging than just having a TV in everybody's basement.

And where that takes us ...
Here's something four-year-olds know: A screen that ships without a mouse ships broken. Here's something four-year-olds know: Media that's targeted at you but doesn't include you may not be worth sitting still for. Those are things that make me believe that this is a one-way change. Because four year olds, the people who are soaking most deeply in the current environment, who won't have to go through the trauma that I have to go through of trying to unlearn a childhood spent watching Gilligan's Island, they just assume that media includes consuming, producing and sharing.

Yes! That's the point: interactivity mediated electronically. That's the thing we need to pay attention to in terms of the mentality that is formed by it.
And for the Christian faith? The end (we knew it already, really) of monological discourse: was it ever really there in scripture, or are we waking from a book-induced trance? We return to dialogue, interaction and personal contact, but with the enhancement of multi-media...

WorldChanging: Gin, Television, and Social Surplus:

25 March 2008

Technology and learning

In a piece of research that showed that good use of ICT in teaching can actually increase learning as measured by test results, we are told that the real insight gained is more nuanced and already known. "The key to success with instructional technology is to keep the focus on student-related outcomes and learning." That goes for all teaching and learning.
The news item is here: College Students Score Higher In Classes That Incorporate Instructional Technology Than In Traditional Classes:

24 February 2008

Development aid in ICT

It's good to see FLOSS being at the forefront of enabling the two-thirds world to develop where ICT is involved. The EU has invested some money in researching this.
The FLOSSInclude project will carry out an in-depth analysis of the technical, business and socio-political needs for the growth of FLOSS use, deployment and development in the target regions. The project further aims to build on the network developed during the course of the study to promote international collaboration between the EU and developing countries.

FLOSSInclude will expand on earlier work by some of the consortium partners, such as the groundbreaking FLOSSWorld study, by providing a rich contextual analysis based on the specific expertise and country experiences of the participating organisations and countries.

In pilot efforts, the partners will implement FLOSS solutions, tools and services to ensure they are cost-effective and practical for each environment. The result will be a roadmap for future EU development research cooperation, with concrete and validated solutions for clearly identified needs. Together with a massive push in dissemination and networking, the FLOSSInclude aims to ensure a lasting impact beyond the project duration.
