Tuesday, August 12, 2008

Prof. Randy Pausch

I learnt yesterday that Prof. Randy Pausch of Carnegie Mellon University lost his fight with cancer and passed away last month. He will be a big loss to CMU and to the computing world.

If you are looking for some inspiration, I recommend watching his last lecture - click here

Saturday, August 9, 2008

Java 1.6, Eclipse and my new mac

Wow, I had an interesting time trying to get my Eclipse plugin, developed on Linux using Java 1.6, to work on my OS X JVM, which is a 64-bit JVM. I was getting all sorts of problems with the SWT libraries: java.lang.UnsatisfiedLinkError: Cannot load 32-bit SWT libraries on 64-bit JVM. I found a few workarounds for this using Google, but they seemed very complicated, and as I am a Mac newbie I thought I'd better not risk them.

So I decided to change the plugin to 1.5: I set the runtime environment to use a 1.5 JVM, but then I was getting problems with the version number of the compiled classes: "bad version number in .class file". So I changed the build environment to a 1.5 JVM. I thought this would fix it, but no. Then I did a search and found a way to rebuild the project (Project -> Clean...). Still no joy. So I removed the class files manually, thinking the files newly built with the 1.5 JVM weren't getting written because the 1.6 versions were still there, but no. After about an hour I figured out what was going wrong - there was still something in my project set to 1.6: the Java compiler settings for the project!!!?!
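For anyone hitting the same "bad version number" error: every .class file records the compiler target in its header, so you can check exactly which JVM a class was built for (javap -verbose will also print it). Here is a minimal Python sketch of that check - the version table is deliberately partial, just covering the releases mentioned above:

```python
import struct

# Partial map of class-file major versions to JDK releases
MAJOR_TO_JDK = {48: "1.4", 49: "1.5", 50: "1.6"}

def class_file_jdk(header: bytes) -> str:
    """Read the target JDK version from the first 8 bytes of a .class file."""
    magic, minor, major = struct.unpack(">IHH", header[:8])
    if magic != 0xCAFEBABE:
        raise ValueError("not a Java class file")
    return MAJOR_TO_JDK.get(major, f"unknown (major {major})")

# A class compiled for 1.6 starts with magic 0xCAFEBABE and major version 50;
# in practice you would pass open("Foo.class", "rb").read(8) instead.
header_16 = struct.pack(">IHH", 0xCAFEBABE, 0, 50)
print(class_file_jdk(header_16))  # -> 1.6
```

Had I run something like this on the stale class files, the leftover major-50 classes would have given the game away immediately.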

Thursday, August 7, 2008

My 10 mile race

So I will be running a 10 mile race on Saturday week, as part of the adidas Dublin Race Series 2008. I did a 5 mile there a few weeks ago in 45 mins, which wasn't too bad for a first go. I am going to really challenge myself and try to do the 10 mile in under 1hr 25mins, which will be pretty tough! After that is the half marathon, and after that the marathon - I am not sure about doing the marathon; the time commitment in terms of training might be just that bit too much, particularly as I finish my PhD.

Authoring of Adaptive and Adaptable Hypermedia (A3H) Workshop

I attended and presented at the A3H workshop in Hannover last week, and have been meaning to outline the talks that were given, so here goes:

First of all the link for the workshop is here

The first talk was by Fawaz Ghali, on the work he has been doing at Warwick trying to connect the social web, in the form of folksonomies, with the semantic web, in particular ontologies. Advantages of this would include semantic relations between folksonomy tags, enabling reasoning on the social web, and augmenting the authoring process of adaptive hypermedia by providing rich, free, hierarchically structured data. Ghali's process consists of three main phases: first misspelt tags are filtered out, then tags are grouped according to their co-occurrence value, and finally the groups are mapped to matching elements of ontologies.
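As I understood it, the grouping phase boils down to counting how often tags appear together on the same resource. Here is a rough Python sketch of that idea - the bookmark data, the threshold, and the greedy merging are my own illustration, not Ghali's actual algorithm:

```python
from itertools import combinations

# Hypothetical tagged bookmarks: each set is the tags one user put on one resource
posts = [
    {"semanticweb", "ontology", "rdf"},
    {"semanticweb", "ontology"},
    {"folksonomy", "tagging"},
    {"folksonomy", "tagging", "web2.0"},
]

def co_occurrence(posts):
    """Count how often each pair of tags appears on the same resource."""
    counts = {}
    for tags in posts:
        for a, b in combinations(sorted(tags), 2):
            counts[(a, b)] = counts.get((a, b), 0) + 1
    return counts

def group_tags(posts, threshold=2):
    """Greedily merge tag pairs whose co-occurrence count meets the threshold."""
    groups = []
    for (a, b), n in co_occurrence(posts).items():
        if n < threshold:
            continue
        for g in groups:
            if a in g or b in g:
                g.update({a, b})
                break
        else:
            groups.append({a, b})
    return groups

# Yields two groups of related tags, e.g. {ontology, semanticweb} and {folksonomy, tagging}
print(group_tags(posts))
```

The resulting groups would then be the candidates for mapping onto ontology concepts in the final phase.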

Maurice Hendrix then presented his paper, which outlines how a meta level can be added to the LAG language to allow for fuzziness of values in the LAG rules. This solves the following limitations identified by Hendrix:
- adaptation is on a per-concept basis
- the adaptation engine does not allow for non-instantiated variables
- multiple strategies cannot be combined.
I understood how you could combine multiple strategies with this approach, as you don't have to strictly specify a learner model value, so beg_engineers and beg_cs can both be treated as beg students, but I didn't really get how the other limitations are solved.

Sergio Gutierrez looked at the formalisation of exercises, which reminded me of the work we did on the Ontaware project looking at the same thing, which we published at the SWEL workshop at AIED05. This work is based on a parameterised approach and is interesting, as it shows how many questions can be generated from the same root question.

The work of Ortigosa et al. looked at the smallest number of questions you need to ask in order to understand the learner well enough to provide adaptation.

Another paper from Warwick looked at the use of collaborative authoring in the development of AEH for the MOT tool. To do this the CAF was extended to include collaborative elements, such as tags and opinions. This allowed for feedback on material in AEH - which could then be used in the adaptive process.

Eric Ras presented a really interesting paper which outlined the use of semantic wikis to capture knowledge, which are then used as the basis of the creation of learning spaces.

After the papers were presented there was a discussion panel, where interesting points were raised about lowering the barriers to authoring in developing AH. Most people agreed that the cost-benefit ratio has to be addressed: we must outline how beneficial it is for an author to create AH, and how we can lower the cost by providing better authoring tools.

This discussion made me think that the application of AEH in the academic scenario (the most common test bed) perhaps doesn't make sense anymore. Academics do not have the time to spend on authoring, and do not see much in the way of recognition when they do develop good AEH. I think perhaps the focus of the AEH community should be on the corporate sector, as this sector spends a massive amount of money getting employees up to speed at the start of their career, and also during their career with a particular company. This represents not just a massive cost in terms of the salaries paid to the employee and the trainer, but also an opportunity cost lost while the employee is training. Anything that cuts the time employees spend training will represent a massive cost saving for business. This is where we should be aiming our AEH apps, as the cost-benefit ratio is significantly more in our favour here. Then, when we have matured our authoring tools, perhaps we could start to look at the academic scenario, which will be much harder to break into.

Overall a great workshop, where real debate took place about the future of authoring in this area.

Friday, July 25, 2008

My new Macbook

Well, I have made the change: I got my first Mac there. I have always used a PC up to now, normally with some flavour of Linux or Windows on it, so there was always going to be a bit of a learning curve. Well, I was pleasantly surprised with how little of a learning curve there actually was. Within a few hours I was able to find my way around the various menus of OS X. I am still by no means an expert, but I'm getting to a stage where I am comfortable with the OS.

The biggest aspect of the Mac to get used to is the difference in the keyboard on my new MacBook. The buttons appear to be in slightly different positions, and sometimes a letter I type doesn't appear. I think this might be down to some setting.

Another slightly annoying aspect is the sharp edge of the case - it makes the machine look good but does nothing for your wrists; it cuts the arm off you.

Anyway, that is Mac week one down - now for some serious work on it, as I have a conference in Hannover.

Saturday, July 5, 2008

ICALT - Day 3

Today I was principally concerned with the workshop I was to present at - Crafting didactic materials based on IMS LD: From Requirements to Evaluation. There were to be six presentations in this workshop, but thanks to one of the organisers I was also able to give a quick outline of my PhD work at the end.

Sheila MacNeill from CETIS outlined interoperability between LD and the rest of the course development scene. She also outlined four levels of granularity that must be addressed - LOs, ??, Activity and Course. Then the work on the successor to Reload, ReCourse, was outlined by David Griffiths from the University of Bolton. It is interesting to note that the team has now moved to using GMF, a technology we use extensively in our project. He also mentioned services in LD and how badly they are defined. David showed how widgets could be used as learning services, served from a central widget server.

Davinia Hernández-Leo outlined a way of raising the level of abstraction when using complicated specifications such as LD, using educational patterns. I am not very sure how this works exactly, but will have to look into it in more detail, as the idea fascinates me. Carmen Padron then outlined how evaluation cycles can be run on LD courseware, and how this can be specified on the courseware using the runtime adaptation approach developed by Telmo Zarraonandía, the co-author of the workshop paper. This led nicely on to Abelardo Pardo's paper on how he has devised a way to manipulate an LD design at delivery time by manipulating property values. He states he is uncomfortable with the theatre analogy used in LD, as a script does not allow for the reactionary changes needed in courseware. If you originally design your courseware to consider the manipulation of property values at delivery time, this is a very powerful tool. Finally, Jekaterina Bule from Latvia outlined the evaluation of eLearning in Riga.

After the workshop I attended an interesting session entitled "Content Authoring Technologies III". Two speakers particularly took my attention due to my PhD work on modelling courseware. The first was Ivan Martinez-Ortiz, who described the work he is doing in developing an EML authoring notation. He discussed a flow-oriented authoring notation; there was some discussion afterwards about the suitability of this type of concrete notation, given the variability in courseware at delivery time. Another concern was that the notation is too geared towards computer scientists, but Ivan is looking at testing it with non-computer-science users, and I think his results from this will be very interesting.

Dirk Frosch-Wilke then presented his paper entitled "Evolutionary Design of Collaborative Learning Processes through Reflective Petri Nets". This paper showed how an approach based on coloured Petri nets could be used to provide a snapshot of a learning process. To do this there were two levels, a meta-level PN and a base-level PN - sort of similar to metamodelling, I think. Using Petri nets means verification can be carried out on the courseware; I asked whether this meant a sort of model-checking approach, but I was wrong there. The approach is used a lot in business process analysis. I think the use of Petri nets, and how they are applied by Dirk and his team, is something I want to get my head around - very interesting work.
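To get my own head around the basic mechanics, here is a toy place/transition Petri net in Python modelling a trivial learning process - just the token-firing rules, not the coloured or reflective two-level machinery Dirk described, and the places and transitions are entirely my own invention:

```python
# Minimal place/transition Petri net: places hold tokens, and a transition
# fires when every input place has a token, moving tokens to its outputs.
class PetriNet:
    def __init__(self, marking):
        self.marking = dict(marking)   # place -> token count
        self.transitions = {}          # name -> (input places, output places)

    def add_transition(self, name, inputs, outputs):
        self.transitions[name] = (inputs, outputs)

    def enabled(self, name):
        inputs, _ = self.transitions[name]
        return all(self.marking.get(p, 0) > 0 for p in inputs)

    def fire(self, name):
        if not self.enabled(name):
            raise RuntimeError(f"{name} is not enabled")
        inputs, outputs = self.transitions[name]
        for p in inputs:
            self.marking[p] -= 1
        for p in outputs:
            self.marking[p] = self.marking.get(p, 0) + 1

# A toy learning process: a learner studies material, then takes a quiz.
net = PetriNet({"ready": 1, "read": 0, "assessed": 0})
net.add_transition("study", ["ready"], ["read"])
net.add_transition("quiz", ["read"], ["assessed"])

net.fire("study")
net.fire("quiz")
print(net.marking)  # {'ready': 0, 'read': 0, 'assessed': 1}
```

Even at this toy scale you can see where verification comes in: properties like "the quiz can never be taken before the material is read" fall out of which transitions are enabled in which markings.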

ICALT - Day Two

For day two of the conference there was a big change in the weather - it was like a bad day in Dublin, with temperatures of about 14 degrees and rain.

The highlight of today was the panel session, which discussed a competency approach to eLearning: it looked at what a competency is and how we could use competencies as the primary focus of learning. This talk made me think about how I define the word. I had always thought that competencies and knowledge were very similar, but this led me to believe that a competency is the capacity of a person to demonstrate some skill to some level, whereas knowledge does not necessarily mean there is a skill involved. A competency is about being able to assess whether someone can do a job.

One interesting point from a UK colleague was that there needs to be some sort of comparable framework for eLearning systems. He has basically had enough of all these technologies being thrown out at conferences without mention of the underlying paradigm, framework and model - and why the system was designed the way it was. I found this very interesting and intend to contact him for further discussion.