What will a university web team look like in future?
Normally I would never write a blog post stating my views on the future of anything, as I've found them to be usually inaccurate and, after some time has passed, often humorously so. However, in this case I'm going to give it a go, because this time I would love to be proved completely wrong.
So what do I think the future holds?
Well, let's start with the infrastructure. I think all web servers and associated architecture will be hosted off-site, as web content management will be provided as Software as a Service. This will come either from a commercial provider or from a centralised higher education organisation along the lines of JANET, though not necessarily that particular organisation.
As such the functionality of the content management software will be provided 'out-of-the-box' but offered as a tiered service with payments to reflect what you want to use. Any new functionality will be commissioned from the service provider perhaps with a reduced rate for any developments that can be incorporated into the software releases and therefore shared amongst the community.
So, that doesn't leave much of a remit for the web developers. I'm sure there will be a requirement for some customisation of the website, but that will either be developed by the software provider (at cost) or developed in partnership with them by a small internal application support team, whose remit will be to look after and customise the University's core applications (finance and HR systems, etc.).
The website will be maintained by a network of web editors (of various skill levels) across the organisation, with the task being an integral part of many people's roles - so much so that the title 'Web Editor' may even disappear. Training, both technical and in best practice, will be part of the centralised staff development programme, treated pretty much as a core IT skill. The day-to-day support for people editing the web will be provided by IT Support services, in line with all their other IT systems. Though as publishing to the web becomes more and more common in people's work and personal lives, this will move from the remit of second-line to first-line support.
Any remnants of a web design remit will reside with the Marketing or Communications departments, who are ultimately the ones with a vested interest in brand consistency and design. It may even go so far as to be incorporated into the role of the graphic designers within the print departments and, again, won't be a specific 'web' role.
So that's the role of the web designer, web developer and trainer predicted - so what about the web manager? Well, I'm sorry to say I think this will disappear as a separate role as well. It is likely that elements will be taken from it and given to other roles. I can see the responsibility for the strategic view of content being given to the Marketing or Communications manager. The management overview for each website or online service will sit with the manager of that service - after all, a website is just a communication platform in the same way that service brochures and mailing lists are. Any technical responsibility will sit with some kind of IT Applications Manager, treating the CMS as just another core IT application with supplier/customer partnerships to manage.
So there you are - I'm predicting the death of the web professional as a specific role, which is a really scary thought. However, my reasoning is not merely driven by the sector's need to cut costs. It is also driven by the fact that the web is becoming business as usual, and as a result I think the need for 'specialists' will disappear.
Anyway, I'm keeping my fingers crossed that this vision of the future is completely wrong!
18 November 2010
17 November 2010
What skills do you really need in-house to support your website?
When I first got involved in the web industry (more than a decade ago), websites were definitely the domain of a developer or web master practising the dark art of coding. Soon web skills became a spectrum stretching from design to development and at the time my own skills placed me somewhere in the middle of that spectrum.
Of course the web has changed vastly over that decade, so in the era of blogs, wikis and content management systems is it the time to review the skills you need in your University web team?
This question came to mind after my experience managing my team through a distinct change in their roles once our content management system was rolled out. Let me explain why by briefly outlining what the team ended up doing for the organisation.
There was still the need to write code, but not as much as there used to be. Templates still needed developing and amending; they embedded the latest social media gadgets and brought content in from other systems. They set up sites with blank pages based on agreed information architecture for the editors to fill with content.
Of course there was still design work being done. Again, templates were refreshed, as was the design of the site, to keep it 'fresh' - but mainly in response to the user acceptance testing we did. They also helped out with website banners and HTML email templates.
So what did they do with the time freed up? Well, they supported the 200-plus web editors across the organisation. They fixed the pages the web editors broke - though sometimes they wondered how on earth anyone had managed to break them so completely in a locked-down CMS. They ran training courses, sharing best practice and making sure that the web editors understood what the templates did and how to use them effectively. They were also at the end of the phone, gently reminding the web editors how to use the CMS when they had forgotten. Basically, they supported, monitored and perhaps even controlled the web editors and, through them, the quality of the site.
So, I'll throw three questions out there: is that the traditional skill set of a web designer or a web developer? Does it also encompass the skill set of a trainer? Or is this the new skill set of an all-round web professional, an amalgam of the above?
Actually, I've not got an answer to all of these questions - I'm still reflecting on this one.
15 November 2010
Does a University need its own technical developers?
This is probably the most controversial post of the series, as it suggests redundancy for some of the people reading it, and as an optimist I don't normally like spreading doom and gloom for the sake of it. However, the accountants in your University, College or other organisation may well be asking the same question.
The core business of a University is educating its students and doing whatever it can to generate income to keep itself afloat (consultancy, etc.). Unfortunately for everyone else at a University, if you don't have a direct impact on that part of the business then I think you may be in trouble with the cuts to come.
As I've mentioned in my previous post, since the University I worked for used a content management system (CMS) which didn't have a decent API, most of the development was already outsourced. So was this an issue? Not really, apart from the usual issues of outsourcing (the supplier's availability because of other projects, or their working hours), both of which can be sorted out with a proper maintenance contract and service level agreement.
What was an issue for us was that it wasn't an enterprise CMS and couldn't cope well with the amount of content and editors we had. It also didn't have the underlying architecture to cope with multiple servers like an enterprise CMS. Ultimately that is what the University needed - whether they'll go ahead and complete the enterprise CMS procurement process I started while I worked there will be interesting to observe.
Anyway, back to my point. It seems to me that if a CMS provides the majority of the functionality an organisation requires, they have two options. First, they could wait for the new release of their CMS software, which may or may not include what they want. Or they could simply commission the CMS company to develop the functionality for them, as and when they require it. Both options could remove the need for developers to be on their payroll.
What it won't remove is the need to have people skilled in using the CMS, as in my experience most of the University web editors do not have the knowledge to use the CMS beyond the basics of adding, editing and deleting content. However, these skilled CMS people don't necessarily need to be highly skilled (and therefore expensive) developers.
I guess another option would be along the lines of shared services, where either formally or informally organisations pool their development knowledge and resources to cut costs. However, this could still mean reducing the number of developers on the payroll.
As an ex-developer myself, I really hope this isn't the case and the sector values the skills it has in its in-house development teams. I hope organisations are also prepared to fund those teams in a manner that delivers a high standard of service and support for what is, in my opinion, the best communication channel available to them.
However, there is this nagging doubt in the back of my mind that this optimistic view isn't realistic and hard times are ahead for developers.
Does a University need its own CMS?
So, as I see it, there are lots of Universities and Colleges in the UK, each with their own content management system (CMS) running similar content and similar functionality (often using the same CMS). If I look at this with my 'business head' on, it seems to show a lot of duplication.
Yes, I know, Universities and Colleges are competing against one another - but are you really competing on the strength of your website or the functionality of your CMS? Or are you competing on the content and reputation of your courses, the future employability of your students and the level of your fees? Personally I think it's the latter. So why are Universities not looking to share content management systems and treat them a bit more like software as a service?
As far as my understanding goes, enterprise content management systems can manage multiple domains, multiple sites, multiple styles and page layouts, and lots and lots of content. So isn't it possible for two or more organisations to come together to use the same CMS, cutting down on the costs of hosting and development/maintenance, and certainly cutting the cost of the annual licence fees and support contracts?
I understand that, once again, this may be controversial and there may be technical reasons why this isn't possible (e.g. a critical mass of content or domains with which even an enterprise CMS may struggle).
However, isn't it something worth exploring?
Does a University need to host its own web servers on site?
In my last blog post I suggested that this is one change that may be on the way for a University web team and here is why.
From my limited experience, server rooms are expensive to build and maintain. In the time I worked at the University, the number of servers seemed to grow exponentially. There wasn't a service or piece of software rolled out that didn't have some online element and a database somewhere behind it. For every service that came in, test, development and live environments were also set up - so for every live server there were (and still are) two more, unseen but still using air conditioning, power and rack space. As a result, IT teams reacted by virtualising servers, making the physical hardware more efficient and cost-effective. They've also extended and upgraded server rooms at considerable cost. However, is this enough to keep up with long-term demand? Is this approach cost-effective in the longer term? I don't think so.
So my question would be:
Why is the University hosting all its servers and services on site?
Wouldn't it be better for a University to just host the services that need to be secure, where access needs to be tightly controlled and where the data is the most confidential?
Web servers by their nature hold information that ends up in the public domain. The website is also a high-profile service where both visitors and senior management expect 100% availability - so resilience is key. It is also a key service in the case of a disaster - if the rest of the place disappeared in a plume of smoke, the website would be an integral part of any communication with students, staff and the local / national press. With all these factors, it makes sense to me to host the web servers outside the University in a resilient environment.
Obviously there are lots of commercial providers who offer hosting services that a University could turn to, but I'm not sure that is the cheapest or best option. Instead I think something more along the lines of a shared or centralised service provided by Universities would be a better bet.
There has been lots of talk of shared services in the past, and I think server hosting - specifically web server hosting - would be a good place to start. JANET already runs a web hosting service, but I'm not aware that it is being used widely for universities' main sites (or even for contingency sites).
However, I think it's certainly worth further investigation now. If their service doesn't quite match your current requirements, surely it is in JANET's interest to work with the HE sector to change that and to provide a service fit for the new 'leaner' University of the future? If not, then I can see an opportunity for the commercial sector (perhaps the CMS vendors) to step in instead.
I've been reading with interest many discussions on what the future of the University web team will be over the coming years of austerity. In response I wrote this blog post a few months back and have now been prompted by Brian Kelly and Mark Greenfield's blog post (The Axeman Cometh Preview) to look at these again and finally post them.
I suppose at the moment I have an interesting perspective on this debate because I've managed a University web team for five years but I've recently been made redundant - so you might say I have one foot in and one foot out of the camp.
The main push for change, as everyone knows, is the cuts. However, it's not just that - the industry is changing as well. When I started managing the web team five years ago, there were a couple of developers and a couple of designers controlling the website. The web content was vastly out of date, and if you navigated around the site you'd probably find pages in the last three or four brands.
That's now changed: the University I worked for has a CMS (though technically not an enterprise CMS), the content is much more up to date and there is a consistent brand. There is still a team with a couple of developers and a couple of designers controlling the website, now based within the Marketing Department, but there are also 200-plus web editors who update the content they are responsible for. From my perspective that seems to be a model replicated across the University sector, in response to rising demand without a comparable budget to match.
So you might say that little has changed, but I don't think that's the case. If I look objectively at what my team did, they spent the majority of their time supporting and training web editors, fixing things the web editors broke, and producing banners and HTML emails to support the University's Marketing department. Little of their time was spent coding any more; since they use a proprietary CMS without a decent API, most of that development work is already outsourced to the CMS company. The most development they do is setting up a skeleton site structure and maybe embedding the latest social media widgets. Now, whether that is right or wrong is a whole other discussion and not one I'm going to tackle here.
So could this model and the University web team as we know it be outsourced in future? I think so, at least partially, and in a number of ways.
1) Outsource the server hosting
2) Treat your CMS more like 'software as a service'
3) Outsource or share your technical development
4) Pay for the right level of skills in-house
Although I've just listed them here, I will explain what I mean by each in a series of blog posts. I know each is potentially a contentious issue and I need more than a few lines to explain my thoughts. I'm not saying I'm right, but I think it's important to discuss this now, because when your University's accountants are looking to make cuts, they'll be asking these same questions.
Also, in a final blog post of this series, I'll summarise my views on what a University web team will look like in a few years' time. I'm hoping, for the sake of my ex-colleagues and friends, that I'll be completely wrong, but it will be interesting to see how accurate my 'Tomorrow's World' prediction is.
22 October 2010
So that's it - the cuts have been announced and we all have an idea how bad it is going to be. Unfortunately (or maybe fortunately), I've had a head start on finding out what that means for employees made redundant from the higher education sector, and so far I can report that it's not particularly easy finding alternative work.
With the public sector likely to be shedding jobs left, right and centre, I'm looking to get back into the private sector, where I worked before I went into HE in 2004. However, so far, despite 13 years' experience in the industry, it's proving difficult. There are so many people applying for jobs that you're lucky to hear anything back, let alone get an interview. So it started me wondering whether coming from the HE / public sector is working against me, when so many other applicants have recent commercial experience.
When I worked in the commercial sector, I was aware of a perception that working in a University was easy, without the same commercial pressures that the private sector has - the old 'time is money' approach. It was certainly reflected in the comments from some of my ex-colleagues when I told them where I was going to work, and although they were 'joking', it did reveal an underlying 'perception' of what it is like.
That 'perception' is not what I witnessed or experienced over the past six years - far from it. I worked at a post-'92 university where expenditure was tightly controlled. I often used to remark that the Finance Director had short arms and long pockets, especially when funding requests were turned down. Getting new permanent staff was like getting the proverbial blood out of a stone, even with the backing of a business case. However, that approach brought the institution financial security - well, that was before the economic meltdown and the deep cuts to come.
So, just in case anyone reading this is under the impression that anyone working at a University is a second-class employee, I'd like to beg to differ...
As I've said above, I worked to a tight budget; I still had staff to pay and project budgets to work within - time was definitely still money. I also always had more work than I could ever deliver with the resources in my team - 'spare' developers were never something I encountered.
Also, frustratingly, my IT projects were often not the focus of the academic community - that, quite rightly, was the students. So multiple projects had to be planned, put on hold, re-planned, picked up and juggled to fit in with the academics' timescales. Effective project planning is a skill that you definitely learn, and learn quickly.
I dealt with awkward customers aplenty, many with unrealistic expectations of what their project budget would give them. Others didn't have a clue what they wanted or how to communicate the ideas they did have. In fact, they were sometimes the easiest to deal with. So business analysis skills and being able to talk 'technical' in non-technical language come in really handy.
I still had to manage supplier as well as customer relationships; in fact, I was pretty good at extracting the best deal from them. So negotiation, sales and account management skills become second nature. I've also had more experience running effective meetings than I ever would have done in the commercial sector. With sometimes three or four meetings a day, and 12 papers emailed to you that morning for the fourth meeting, you develop exceptional speed reading and analytical skills - working out what you need to know from all that information for each meeting is definitely a skill, if not an art.
Yes, I had the luxury of a staff development budget allowing me to attend conferences - well, one a year, maybe two if I was really lucky. So it is easier to keep up to date with the latest trends in your particular field. Though that is something I love to do anyway - I am a self-confessed geek - I love technology and would keep up with technology trends whether I had the staff development budget or not.
In fact, I've learnt more about managing projects that have to deal with both legacy and the latest 'buzz' technologies than I could have in the commercial sector. It was only this year that the University upgraded its PCs from IE6, so we had to develop for everything from IE6 to the latest browsers, as well as getting Twitter feeds and other social media tools to work in them. This was a challenge at the best of times, but you do it because you have to.
So what is my point? Well, it seems to me that, far from being second-class employees, staff from the public sector have the skills the commercial sector needs. In fact, I would go so far as to suggest that, in an economic climate where services once performed by the public sector are being outsourced to the commercial sector, it is exactly these experienced employees that commercial companies should be snapping up.
As public sector unemployment figures are set to rise I really hope, for all our sakes, that this is the case.