||Thursday 15 September 2005
The first meeting of the 2005/6 season got our annual programme off to a good start.
Our speaker, Robb Mann of QVIP, took us through the benefits of outsourcing as well as the pitfalls waiting
for the unwary, and proposed potential solutions for the shortcomings in the process.
Drawing on case studies, Robb made several points that are perhaps not obvious.
For example, outsourcing of support can leave a company less
knowledgeable about new and current technical possibilities. Unless it
is part of the contract, it may not be in the supplier's interests
to spend time looking for better or more innovative
future solutions on behalf of their client. So whilst the relationship
starts off well, future technologies and solutions are not properly
investigated and the technical environment stagnates.
QVIP own a database of successful projects against which similar
projects, either proposed or actual, can be measured. Areas where
performance is suboptimal can then be identified and addressed.
Robb's warm method of delivery encouraged a great deal of audience
participation. The floor freely added their own experiences and thoughts
to the discussion.
A copy of Robb's slides is on our downloads page.
You can find an overview by selecting Smart Sourcing on QuantiMetrics' downloads page.
Robb referenced a presentation given by CSC's David Moschella at the November 2004 conference of the Strategic Outsourcing Special Interest Group.
You can download David Moschella's presentation from the conference page after (free) registration.
||Thursday 6 October
|State transition testing
I have a confession to make: if you'd asked me to rank this year's sessions in order of interest, a session on testing would
have come near the bottom. But that was before I'd heard Peter Quentin's fascinating talk.
Peter set the scene by agreeing with the audience that, since the combinations of variables to test in a complex system are
nearly always infinite, we need a methodical way to test to an agreed level of coverage - an agreed depth to which the testing
should be carried out.
He introduced the concept of state transition using the examples of, first, a light switch which could be on or off, and then
a simple digital clock with buttons to switch to display time, change time and change date modes.
He explained three different levels of coverage:
- "0-switch" or "Branch Coverage" - testing each state transition;
- "1-switch" or "Switch Coverage" - testing each state transition pair;
- "n-switch" or "Boundary Interior Coverage" - testing each loop of state transitions 'n' times.
He explained that, since exhaustive testing was impossible, it would be up to the business user or the client to determine which
level of coverage would be sufficient, and that in practice, due to financial constraints, testing often only achieved 0-switch
or branch coverage.
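For the curious, here is a rough sketch (in Python, not from Peter's materials) of how 0-switch and 1-switch test cases can be enumerated from a transition table; the state and event names are invented for the digital-clock example, assuming a single mode button that cycles through the modes.

```python
# A sketch of deriving 0-switch and 1-switch coverage from a transition
# table; state and event names are invented for the digital-clock example.
transitions = {
    ("display_time", "mode_button"): "change_time",
    ("change_time", "mode_button"): "change_date",
    ("change_date", "mode_button"): "display_time",
}

# 0-switch (branch) coverage: exercise every single transition once.
zero_switch = list(transitions.items())

# 1-switch coverage: every valid pair of consecutive transitions
# (the end state of the first must be the start state of the second).
one_switch = [
    (a, b)
    for a in transitions.items()
    for b in transitions.items()
    if a[1] == b[0][0]
]

print(len(zero_switch), "transitions,", len(one_switch), "transition pairs")
```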
In a departure from the normal lecture-only style of meeting, Peter handed round workbooks and set the audience the task of
constructing testing trees for, first, a simple electronic toothbrush and then a cassette player.
In questions after the session, we examined the difference between "black box" testing (testing without reference to the source
code) and "white box" testing (where the code is examined to make sure that certain paths are tested). It was agreed that
state transition testing is a black box technique and that a state transition diagram is an abstraction of the system created
for the purposes of testing.
Some developers present claimed that they could write code containing errors that would only manifest themselves after, say,
10 iterations of a loop, and that Peter's method would not find such bugs. To which the answer might be: if you do that deliberately,
you'll soon be looking at your P45! In fact, 10-switch coverage, which exercises each loop of state transitions 10 times, would
find such errors.
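To make the developers' claim concrete, here is a hypothetical example of such a bug: correct for nine passes through the loop and wrong on the tenth, so only coverage that drives the loop that deep would expose it.

```python
# A hypothetical fault that surfaces only on the 10th loop iteration.
def accumulate(values):
    total = 0
    for i, v in enumerate(values):
        if i == 9:
            total -= v   # the bug: wrong sign, but only on iteration 10
        else:
            total += v
    return total

print(accumulate([1] * 9))   # 9  - looks correct under shallow coverage
print(accumulate([1] * 10))  # 8  - should be 10; only deep coverage sees it
```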
This highlighted the fact that exhaustive testing is impossible and that techniques such as state transition testing exist
to achieve some acceptable level of coverage. But that level of coverage is always compromised by the time available to
carry out testing. If it were known that a loop might exhibit a fault only after 10 iterations, then the tester would no doubt
use testing time to carry out 10-switch coverage on that part of the system. If it were not known, then indeed this fault might
very well go unnoticed until some time after the system was implemented.
Such is the nature of testing.
The lively discussion went on until the lecture theatre had to be vacated, and then continued in the Lamb and Flag.
A copy of Peter's slides and the workbook is on our downloads page.
||Thursday 10 November
As he promised, Alan Lenton has posted his lecture notes on the ibgames website.
|Open Source - Turn on the LAMP
What a challenge for a speaker! We were fortunate enough in November
to hold a joint meeting with, and welcome members of, the
Open Source Specialist Group (OSSG).
Our speaker for the evening, Alan Lenton,
CTO and Game Designer at ibgames,
did an admirable job of spanning the broad range of
interests in, and experience with, Open Source in the (full) room, with a
stimulating, insightful and thought-provoking talk.
Attendees were welcomed with the traditional cuppa and a bunch of free goodies:
a distro CD of Fedora Core 4 with a Fedora t-shirt for all, a copy
of Linux User & Developer magazine, a Linux
Magazine archive DVD with over 4,000 pages of Linux articles, and
the coveted BCS Oxfordshire branch mug and coaster. (I am sipping my
morning cuppa from mine as we speak :)
I introduced Alan with a recognition of the challenge posed by the breadth of
audience expectations and the range of industry topics he hoped to cover.
To set the scene, I mentioned some recent milestones in the progress of open
source, illustrated by recent industry reports: 1 in 6
web surfers in the USA now use Firefox
as their browser; Linux has spread beyond PCs to being
embedded in consumer routers and Wi-Fi Access Points;
the Open Invention Network (OIN) has been announced; and
a private equity firm is acquiring Ingres from
Computer Associates. This last snippet led nicely into the spirit of
Alan's talk. As a Sociology graduate, Alan has a particular interest
in the social interactions within technology communities. Software
businesses that aspire to implement open source strategies
effectively need to have a good sense of how to work well with open
source communities. In many cases, how one works well with an open
source community can be in direct contradiction to what one has
learned in 20 years of managing an enterprise software business. It
will be interesting to see whether or not Ingres under the
leadership of a private equity firm will be able to build a strong
community around their product.
Alan kicked off with an overview of the LAMP stack: Linux, Apache,
MySQL and the 'P' programming facilities offered by a choice of languages
spanned by the likes of Perl, PHP and Python.
He used this foundation to warm up to the main focus of his talk -
aspects of developing with, and using, Open Source Software. The talk
built on a keynote talk Alan first gave at the ACCU Spring
Conference. [The ACCU web site is at http://www.accu.org/]
The talk was aimed at developers in a business
context wanting to know more about open source: what is it about,
and who is it for? The goal was to help programmers and consultants
come up with a considered reply when asked whether their company
should embrace open source software, and how they should do so. The
objective was not to proselytise open source (as Alan said, if that
is what you are after, just track http://www.slashdot.org
for a few days :). The evening built an open source "tripos"
of three legs:
- What open source is
- What open source is good for, and what it is not so good for
- Looking after open source development teams
It then looked at the three legs of open source as it currently exists from a
different perspective:
- open source as a development and applications platform
- open source as a method of licensing software
- open source as a method of developing software
The evening looked at each of these components in turn.
LAMP – the 'L' covering the spread of
Linux options, from the various BSDs,
through the host of popular distros
such as Ubuntu, Fedora, SuSE, Mandrake, Debian, Knoppix etc., to a
number of hardened variants designed for security, such as the
NSA's secure variant of Linux.
LAMP – the 'A' building on Apache,
a ubiquitous and function-rich web server, the most widely
deployed web server on the Internet, with legendary reliability
despite its complexity.
LAMP – the 'M' standing for MySQL, an open source SQL database, alongside
other open source offerings such as BerkeleyDB,
an embedded open source database.
LAMP – the 'P' is the
scripting language used to bind everything together, and the
transition to the developer focus of the talk - taking a quick
stroll past Perl, PHP, Python and other options such as Ruby,
shell script and even Rexx.
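To make the stack concrete, here is a minimal sketch of that 'P' tier: a Python CGI script that Apache might run on a Linux host to query MySQL. The database, table and credentials are invented for illustration.

```python
#!/usr/bin/env python
# A minimal sketch of the 'P' in LAMP: a CGI script run by Apache on
# Linux, querying MySQL. Database, table and credentials are invented.
import MySQLdb  # the classic Python MySQL driver

print("Content-Type: text/html")  # CGI response header
print()                           # blank line ends the headers

conn = MySQLdb.connect(host="localhost", user="web",
                       passwd="secret", db="demo")
cur = conn.cursor()
cur.execute("SELECT title, speaker FROM talks ORDER BY title")
print("<ul>")
for title, speaker in cur.fetchall():
    print("<li>%s - %s</li>" % (title, speaker))
print("</ul>")
conn.close()
```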
As well as addressing GNU compilers and libraries, the talk went on
to walk through the various distro choices that face the developer,
offering some words of advice.
Open source as a method of licensing software – the
second leg of our "tripos", with a précis for
developers on the ramifications of the different flavours of
'open source' licence, categorised into two groups: GNU's
General Public Licence (GPL) and all the rest.
Open source as a method of program development – with
the foundation established, we reached the main body of the talk (and clear
evidence of Alan's passion for the topic): open source as
net-enabled, collaborative, incremental development work. Much touted
as a new method of working, Alan positioned this as being in fact
merely an updated version of a time-honoured way in which craftsmen
and artisans have dealt with the alienation of their work.
Open source as a business method – for
commercial entities, a killer question - what business model,
incorporating open source software, will enable them to make money?
Open Source strengths – from the production of a rich tapestry
of excellent development tools as a stepping-stone to developing
high quality applications.
Open source weaknesses – looking at where the weaknesses of open
source lie, in three fields:
- software patents, copyrighting and copylefting
Building an open source project – Alan wrapped up by looking at how
to build, look after and motivate your developer community.
He closed by stating his belief
that open source development is one of the best methods of producing
high quality software, and a reliable platform on which to build
your applications. But, and it's a big but, you must know what you
are doing to use it effectively.
Question time – a rich discussion followed, driven by audience questions
spanning topics as varied as what can be done to promote broader
uptake of open source in the mass market, beyond business, and
how the principles and spirit of open
source software could be applied to the broader world we live in –
open source economics, medicine, creative arts etc.
We called time shortly before 9pm, but the level of interest was evidenced
by the number of people who went across to the Lamb & Flag pub
after the talk and continued the conversations and debate,
fuelled by the hospitality of the house!
My thanks to Alan for leading a splendid evening. As I drove home later
that night and mulled over the evening's conversations, it occurred to
me, given Alan's balance of technology and focus on the sociological
aspects of open source, that I could sum up the spirit of the evening
in that wonderful African word "ubuntu". This is for two reasons:
- Firstly, ubuntu is a South African ethic or ideology focusing on
people's allegiances and relations with each other. The word comes
from the Zulu and Xhosa languages, and is seen as one of the
founding principles of the new South Africa. A succinct translation
in English could be "humanity towards others" or more
fully "the belief in a universal bond of sharing that connects
all humanity." Maybe a better definition was offered by
Archbishop Desmond Tutu: "A person with ubuntu is open and
available to others, affirming of others, does not feel threatened
that others are able and good, for he or she has a proper
self-assurance that comes from knowing that he or she belongs in a
greater whole and is diminished when others are humiliated or
diminished, when others are tortured or oppressed." I would
need to look a long time to find a better summation of the potential
of the open source movement.
- Secondly, a reflection of many of Alan's points can be found in the Ubuntu Linux
distribution (and I am letting some enthusiasm and prejudice show here!), which
strives to bring the spirit of ubuntu to the software world. From an
'average end user' (as opposed to techy) perspective it offers a
splendid balance: 'out of the box' usability on most hardware
I have tried it on, without being overburdened with complexity, whilst
still embracing 'best of breed' facilities from other open source
distros as a foundation for a range of initiatives, from a KDE
variant (kubuntu) to Education-specific variants.
As I pulled into the drive I was thinking, now that we have a distro for
the masses, about one of the inhibitors to broader Open Source uptake
- distribution. Access to the global Internet is too expensive, too
difficult or just plain impossible for many people across the globe.
Even for people who do have access, it can be prohibitively
expensive compared to the ease of access we in the UK enjoy. With
initiatives like Freedom
Toaster open-source software kiosks making it possible for people
to buy a single DVD-R and burn Free and Open Source software of
their choice, are we on the brink of seeing the spirit of ubuntu
having far-reaching impact across the globe?
The success of Alan's talk was not just in the content and delivery –
the power of our branch evenings lies in the thought-provoking nature of
the topics, helping to generate awareness, momentum and contributions
to our industry. I am certainly glad I was there – thank you, Alan!
||Tuesday 29 November
|You Can't Get There From Here...How computability affects the issues of computer evidence
We were delighted to welcome a large, well informed audience to the 2005 Christmas lecture.
Following on from a highly popular and entertaining talk on Computer Forensics in September 2004, Neil Barrett explained how
computer-based evidence generally has to be taken together with evidence from other sources in order to obtain prosecutions.
He traced some of the issues back to Alan Turing's work on provability and managed to slip in a discreet plug for his new
book, due to be published in April 2006.
Through a discussion on information security, Neil demonstrated that without effective measures to establish Confidentiality,
Integrity and Availability (often known by the abbreviation CIA), we cannot determine who is and who is not 'authorised' to
access information. Firstly, we need to identify the person and allow the computer system to recognise them. He introduced
three types of authentication used to confirm identity (illustrated in the sketch after this list):
- Type 1 - something they know e.g. a password;
- Type 2 - something they have e.g. a token;
- Type 3 - something they are e.g. biometrics.
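As a toy sketch of how two of these types might be combined in code (the helper names and stored values are hypothetical, not from Neil's talk):

```python
# A toy two-factor check combining Type 1 (something you know) with
# Type 2 (something you have); stored values here are hypothetical.
import hashlib
import hmac

def check_password(supplied, stored_hash):
    # Type 1: compare a hash of the supplied password, in constant time.
    digest = hashlib.sha256(supplied.encode()).hexdigest()
    return hmac.compare_digest(digest, stored_hash)

def check_token(supplied_code, expected_code):
    # Type 2: a one-time code from a physical token.
    return hmac.compare_digest(supplied_code, expected_code)

def two_factor_ok(password, stored_hash, token_code, expected_code):
    # Two-factor means two *different* types must both succeed.
    return (check_password(password, stored_hash)
            and check_token(token_code, expected_code))
```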
One- or two-factor authentication will use one or two of these types. However, just to demonstrate that this is not an exact
science, Neil used the example of Chip and PIN payment cards, where the banks seem to have moved from two-factor authentication
based on Type 2 (the card itself) and Type 3 (the signature) to Type 2 (the card itself) and Type 1 (the PIN) - generally considered
a weaker combination.
Equally important were Authorisation - what data the person is allowed to access, administered by some kind of controlling
data structure - and non-repudiation - clean, reliable and unalterable records of "who did what, and when, to what piece of
data", which may involve digital signatures, reliable storage etc. Using various examples and cases, Neil sought to demonstrate
that all computer crime, to some extent, involves exceeding authorised access (presuming, of course, that the authorised
access has been suitably well-defined!).
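For illustration only (the users and resources are invented), the controlling data structure can be as simple as an access-control list:

```python
# A minimal access-control list: the kind of controlling data structure
# that administers authorisation. Users and resources are invented.
ACL = {
    "alice": {"payroll": {"read", "write"}, "ledger": {"read"}},
    "bob":   {"ledger": {"read"}},
}

def authorised(user, resource, action):
    """Any action this returns False for would exceed authorised access."""
    return action in ACL.get(user, {}).get(resource, set())

assert authorised("alice", "payroll", "write")
assert not authorised("bob", "payroll", "read")  # would exceed access
```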
Moving on to evidential issues, Neil said that computers record the actions of authenticated users in terms of the access
granted to processes based on the authorisation data structure. However, processes cannot be prosecuted, only individuals
can stand in the dock! This means that the auditing data must be comprehensive, complete, clear and capable of preservation.
But we must also be able to analyse the operation of the program elements to show how the rules relating to authentication
have been applied. This brought him on to Turing's work on the 'Halting Problem' which showed that it was mathematically impossible
to determine a program's actions in advance - it had to be run to determine the actions. This led Neil to contend that information
security could not be algorithmically determined. This means that the aim of information security is to make the task of exceeding
authorised access as difficult as possible, to determine what a user has done and to persuade them not to do it because we
will be able to detect it.
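Turing's diagonal argument can be sketched in a few lines of Python (a proof sketch, not anything from Neil's slides): assume a decider halts(f, x) existed; the program below then contradicts whatever the decider answers about it.

```python
# A sketch of Turing's diagonal argument. Suppose halts(f, x) could
# always decide, correctly, whether f(x) eventually halts.
def halts(f, x):
    # No correct implementation can exist; this stub just marks the spot.
    raise NotImplementedError("no such decider is possible")

def paradox(f):
    if halts(f, f):   # if the decider says f(f) halts...
        while True:   # ...loop forever;
            pass
    return            # otherwise halt immediately.

# Now consider paradox(paradox): it halts if and only if halts() says
# it doesn't. Whatever halts() answers is wrong, so halts() cannot exist.
```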
After the talk, the lively discussion continued at the Lamb and Flag over mince pies and mulled wine.
A copy of Neil's slides is on our downloads page.
||Thursday 19 January
|VoIP and IP Telephony
Peter Gradwell opened his talk with an introduction to the concept of IP telephony, briefly describing the best known players in the consumer
market, including Skype, Vonage, freetalk from Dixons, Wanadoo, BT Communicator and, announced that very morning, a new service from Tesco.
He described the hardware options:
- for consumers, either a headset or dedicated USB phone that plugs into a PC
- for businesses and more advanced users, either a special ethernet-connected IP phone, or a normal phone plugged into an APA
(Analogue Phone Adapter) device connected to the network.
Internet rules apply - distance isn't an issue
In principle, you can plug in anywhere and use VoIP.
In the UK, Ofcom had been persuaded to allow VoIP operators to assign geographic telephone numbers.
This means that, for example, a Londoner can have an Edinburgh phone number, and you can work from your Spanish villa whilst
appearing to be in Birmingham.
Users in other countries are not as fortunate: German VoIP customers cannot have geographic telephone numbers.
Peter explained that:
- individual VoIP users may be able to benefit from somewhat cheaper calls (depending on their call pattern and the discounts offered
by their telephone service provider);
- large enterprises probably already have sophisticated PABX systems and negotiated discounts from telcos;
- small businesses probably stand to make the easiest and most effective gains from VoIP.
Using examples from his company's customers, Peter explained how a small business, with individuals in different offices
or home locations, can appear "joined-up". Internal calls are free; incoming calls can easily be routed between extensions
if the first user does not pick up, or routed to several extensions at once for the first available user to answer. By the
suitable use of geographic numbers, the business can appear to be local in several different areas.
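As a toy model (invented for illustration, not Peter's implementation), the hunt-group routing he described might look like this in Python:

```python
# A toy model of the VoIP routing Peter described: ring extensions in
# turn and fall through to voicemail. Extension names are invented.
def route_call(extensions, answers):
    """Try each extension in order; return the first that answers."""
    for ext in extensions:
        if answers.get(ext, False):
            return ext
    return "voicemail"

# An incoming call to the 'local' number hunts across offices and homes.
print(route_call(["london_office", "home_worker", "edinburgh_office"],
                 {"home_worker": True}))   # -> home_worker
```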
Peter offered to supply more information: just sign up for "ten things to consider about VoIP".
After the formal session and Q&A, members had a chance to look at some of the IP telephony equipment which Peter had brought
with him, before repairing to the Lamb and Flag to continue the discussion.
Peter's slides are here and also linked from our downloads page.
||Thursday 23 February
|Web Intelligence
Nigel Shadbolt, seen here during his lively and very well received lecture on Web Intelligence.
Nigel started by reminding us of the continuing exponential growth in processing power (Moore's law has held for four decades),
and the facts that the web now has of the order of 10^8 users and 10^10 pages of information.
Nearly all of the pages on the web are designed to be viewed by humans, who, without any particular training, can easily scan
a page about a conference, for example, and pick out details such as event timing and speakers.
Such unstructured information is much harder for a computer to process accurately, although because of the vast quantity of
information available, we would very much like to be able to sift it automatically.
Nigel reviewed the use of metadata for classifying information. While this is an advance over raw HTML, it requires agreement
on the meaning of tags and is hard to extend. Hence the concept of the Semantic Web and the use of an ontology language by
means of which the meaning of tags can be defined. One practical example of the use of ontologies is in Gene Ontology.
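As a small illustration of the underlying idea (not from Nigel's talk), data about a talk can be published as machine-readable triples so that a computer can pick out the speaker and timing that a human reads off the page; the vocabulary URIs below are invented, and the sketch uses the rdflib Python library.

```python
# A sketch of Semantic Web triples with rdflib; the example vocabulary
# (http://example.org/conf#) is invented for illustration.
from rdflib import Graph, Literal, Namespace, RDF

EX = Namespace("http://example.org/conf#")
g = Graph()
g.add((EX.talk42, RDF.type, EX.ConferenceTalk))               # what it is
g.add((EX.talk42, EX.speaker, Literal("Nigel Shadbolt")))     # who speaks
g.add((EX.talk42, EX.startsAt, Literal("2006-02-23T19:00")))  # when
print(g.serialize(format="turtle"))
```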
Nigel described the work of Advanced Knowledge Technologies, one of the IRCs (Interdisciplinary Research Collaborations) funded by the EPSRC and the MRC, and in particular the CS Aktive Space project, winner of the 2003 International Semantic Web Challenge. This continually harvests data from research papers and
similar sources and is able to infer who has been collaborating with whom and to create Dynamic Communities of Practice. A
practical application of the technology has been in the field of breast cancer screening, where by integrating information
from disparate sources such as X-ray, MRI and ultrasound imaging, medical notes, case histories, histopathology etc, a more
rapid and accurate diagnosis can be obtained.
He ended by showing us how the AKT visual tools could be used to focus on emerging technologies, again by automatically analysing
patterns of publication topics.
Questions flowed after the talk and, when we had to leave the lecture theatre, continued over some liquid refreshment in the
Lamb and Flag.
Nigel's slides are here and also linked from our downloads page.
For further reading on Ontologies and the Semantic Web, see Conrad Taylor's excellent account of the Roger Needham Lecture given by Professor Ian Horrocks on 7 December 2005.
||Thursday 23 March 2006
At the morning session, our Blue Badge Guide, Geoff Marshall, put the history of computing into the context of the history
of science and technology. He made it very real for us by telling us something about the personalities involved. The fundamental
science of Newton led to a primitive but useful steam pump built by a blacksmith. A later development, now pressurised, powered
1,700 looms in a cotton mill. We saw a loom controlled by punched cards, an idea adapted to program early computers. We think
of the speed of change as being particular to our era, but between Puffing Billy and the 1868 locomotive there is a vast
difference in design, style, reliability and implementation of the technology. There is a similar
story in comparing the Pegasus, built in the mid 1950s and one of the first commercial computers based on the Turing architecture,
with its valves and trailing wires, and the tight wiring complexity of the Cray computer from the early 1970s. The faster Cray
concept recognised the speed of light as a limitation, so shorter connections meant greater speed. Our visit was
too brief and we missed the firing up of Pegasus which takes place on Thursday afternoons. There are a couple of photographs
here as a reminder of some of the fascinating exhibits we saw, or for others to see something of what they missed. We can
let you have contact details for our guide Geoff Marshall and the other tours which he runs.
Our afternoon visit to the archive at Blythe House was also tortured by time constraints. Our Curator of Computing, Tilly Blyth,
was assisted by the Curator of Communications, John Liffen, on our tour. We went through the Communications Archive to reach
the Computing Archive and John gave us a brief tour of this too. We saw the BBC's first radio transmitter, the 2L0, which
was superseded within four years. We didn't dare ask too many questions in case we didn't get to Tilly's patch. In the Computing
Archive we saw the first laptop, which would break your lap, let alone your back, and which had a screen about twice the size of one
on a mobile phone today. Again this was put into context by Tilly's story of the people involved. The inventor of this laptop was better
at inventions than at marketing: he blabbed about the better model in the pipeline, so no one bought the first, and he went bust.
The story we learned in the morning continued: useful early machines that did real work - that could store information, process
conditional statements, and handle input and output - were then quickly refined and improved. Tilly set us a tease: she pointed to
something that looked like a bath towel rail and asked us to guess what it was. It was a small delay line, similar to the storage
used in 'Leo', the famous J Lyons & Co. computer. Convergence was also a theme of the day; we are not just building on past
achievements in computer technology but basing advancements on the different materials and techniques that become available.
The tour was well titled: 'From Valves to Chips'.
The photos from the Science Museum showing Babbage's Difference Engine and the Jacquard loom are reproduced by kind permission
of Keith Diment. For more pictures visit Photo Album Number 36 at www.diment.org.uk.
Photos could not be taken at Blythe House as it is not suitable on a tour such as ours but arrangements can be made for anyone
wishing to do some research.
||Thursday 27 April
|Computer Games Development: Past, Present and Future
Simon Prytherch, Head of Development for Empire Interactive, gave us the story of games development over the past 20 years. He had used his in-depth knowledge of the industry to pick
out seminal developments in games and hardware platforms to discuss with us. This short roller coaster ride through
the highlights began with Simon showing us one of his own personal favourites, his original development platform for the Atari.
He discussed the major developments over the last few years, currently culminating with the Xbox 360.
He explained the breakdown of work effort that goes into developing a major game costing many millions. The development is
very much like producing a film, with specialist sound engineers and graphic artists, specialist writers and producers all
working alongside the traditional programmers. One recent big development is to write procedural algorithms that draw the graphics
in real time, rather than the very much more time-consuming process used previously, where the graphic artist realised each
object, which was then animated where necessary.
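As a toy illustration of the procedural idea (nothing to do with Empire's actual tools), the vertices of a torus - as it happens, the shape of the demo described below - can be generated from a formula at run time rather than modelled by hand:

```python
# Procedural geometry in miniature: a torus's vertices computed from a
# formula at run time instead of being hand-modelled by an artist.
import math

def torus_vertices(R=2.0, r=0.5, n=32, m=16):
    """Centre radius R, tube radius r; returns n*m (x, y, z) vertices."""
    verts = []
    for i in range(n):
        u = 2 * math.pi * i / n          # angle around the ring
        for j in range(m):
            v = 2 * math.pi * j / m      # angle around the tube
            verts.append(((R + r * math.cos(v)) * math.cos(u),
                          (R + r * math.cos(v)) * math.sin(u),
                          r * math.sin(v)))
    return verts

print(len(torus_vertices()))  # 512 vertices from a dozen lines of code
```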
Much of the equipment Simon used to demonstrate the ideas consisted of specialist development versions of the standard hardware. We
were told the Xbox 360 contains six incredibly fast processors running in parallel, and he showed us the full power of the
graphics engine using some development demonstration software. This displayed a torus covered in very fine fur, all animated
in real time as he rotated the object under his control. The effect was photo-realistic and needs to be seen to fully appreciate
the massive power of the Xbox processor architecture.
Important current developments include the online gaming communities built and supported by both Microsoft and Sony, each having
their own network. Microsoft in particular is developing small add-ons and options for their games which may be purchased
online for small payments, as they believe this will build into an important revenue stream for the industry. The small payments
are made using a "micro payment" mechanism specially set up to support the small transactions necessary (sometimes in pence).
Future developments discussed included the new Sony PS3, soon to be released, and the new Nintendo device, possibly to be called
Wii (exclusively revealed in the lecture - remember, this was one of the places you heard it first!). The PS3 and the Wii are
both even more powerful than the Xbox 360. What will be possible on these platforms?
Simon thought that we were now fast approaching the point where games are so realistic that they are just another
form of entertainment, alongside and rivalling television for a place in the living room. He believed you would soon be just
as likely to relax by picking up your games machine controller and pitting your wits against the rest of the family, or friends
and relations networked into your home in a multiplayer game, as by watching TV. Films, music, TV media and games are all merging
into potentially cross-fertilised media entertainment experiences.
Another interesting development appearing on the horizon is "games for free". Such games would include advertisements
that appear as part of the game play, which would fund the cost of developing the game.
||Thursday 25 May
Branch chairman Sheila Lloyd-Lyons opened the AGM by welcoming members to the meeting and thanking the committee for their
support during the year. Sheila took over as branch chairman part way through the year when her predecessor, Robert
Ward, found it too much to combine the role of chairman with the demands of new fatherhood and the day job. Robert continues
as a member of the branch committee and will continue to sit on the national BCS Branch Management Committee.
Sheila referred members to her letter, copies of which were available. The letter reviewed the past year's programme and looked
forward to the programme for 2006/7, preparation for which is well under way.
Branch Treasurer, Tony Cox, reviewed the branch's finances which, thanks to his careful management, allow us to deliver a
high quality programme to our members.
|The Semantic Web: too clever for its own good?
In a fascinating talk, Dan Zambonini started by reminding us about the Semantic Web and how it developed. He described metadata,
RDF schemas and triples, OWL, and reviewed some of the 100+ Semantic Web applications which exist today: Edutella, MusicBrainz,
Annotea, foafnaut, the Yahoo! Creative Commons Search, Haystack and PiggyBank.
He then suggested that interest in the Semantic Web was declining, or had at least levelled out. This was supported by stats
from google.com/trends which showed a lower level of hits for "Semantic Web" than, for example, Fortran. Amazon's book list
shows far more books on the latest hot topic, AJAX, than RDF.
He surmised that this could be because search engines such as Google have made it very easy to find information on the web
without authors having to classify their pages using what may appear to be complex RDF schemas. He quoted a respected blogger
who had written "I get the feeling that in trying to achieve the ontological purity needed for the Semantic Web, it's starting
to leave the desperate hacker behind".
He concluded by saying that if you want to participate in the Semantic Web, you should expose your data as RDF. The basic
technologies are ready and the potential is huge, but you may not see any effect for a while yet.
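In practice, exposing your data as RDF can be as little as mapping existing records onto a shared vocabulary such as FOAF and publishing the result; here is a minimal sketch using the rdflib Python library, with an invented record:

```python
# A minimal way to expose data as RDF: map a record onto the FOAF
# vocabulary and publish the file. The person described is invented.
from rdflib import Graph, Literal, URIRef
from rdflib.namespace import FOAF, RDF

g = Graph()
person = URIRef("http://example.org/people/dan")
g.add((person, RDF.type, FOAF.Person))
g.add((person, FOAF.name, Literal("Dan Example")))
g.add((person, FOAF.mbox, URIRef("mailto:dan@example.org")))
g.serialize(destination="people.rdf", format="xml")  # link this from your site
```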
Dan's slides are here and also linked from our downloads page.