A love letter to Stitcher


 

Stitcher is a service that allows you to “Listen to over 40,000+ radio shows and podcasts on your iPhone, iPad, Android or PC – anytime, anywhere”. I discovered Stitcher about two weeks ago, and I fell in love with it immediately. Then a problem struck and Stitcher was gone for several days.

Below is a letter that describes my feelings:

Dear Stitcher

Even though it has only been a short time, I’m so glad that you came into my life. You and I seem to be such a great fit. You completed me.

Every evening we would go walking together. I would listen intently as you told me stories of the world. You educated me, and introduced me to people who had their own interesting stories. Because of you, I became a new person. A person that I liked.

I loved that we always traveled together, and that you were always there for me. You knew exactly what I needed. Sometimes I needed to know what was happening in the world. Sometimes I needed advice and inspiration. And sometimes, I just needed to relax. Whatever it was, you always came through. And I loved you dearly for it.

And then, Stitcher, you were gone… You didn’t tell me that you were going, or why. I wasn’t even sure if, or when you were coming back. I felt pain. I felt loneliness. I felt lost. Oh, why did you go? Were you OK? Were you hurt? Or was there another reason that you left? I searched for answers, but there were only hints and rumors.

Dear Stitcher – you were the world to me. I didn’t know how I was going to handle this “emptiness” that I was now experiencing.

But you know what happened, Stitcher? After a short period of feeling sad, and sorry for myself, I started looking for something else that could fill the space that you left. It wasn’t easy. You were perfect. All the others paled in comparison. But I needed something that would educate and entertain me the way you did. And I found it. It wasn’t able to give me that warm feeling that I got when I was with you, but it did keep me from feeling so lonely.

Stitcher, I have heard from friends that they have seen you around, and that you seem to be doing OK. And I’m really happy for you.

I’m not, however, sure if I’m ready to let you back into my life. It still hurts too much.  This might change over time. I might be able to love you again, as I once did. You will always have a place in my heart, but let me go through this process. Time has a way of healing everything, and I truly hope that we will be together again one day.

With All the Love in my Heart

– Mark


Look Down

In a recent post (“Is being Socially Connected online really that damaging?”), I discussed a response to a video on YouTube that lamented how sad it is that people are constantly online.

I’ve just discovered another response to “Look Up”. This one is called “Look Down”.

And here’s the link to another good one:

 

“User Adoption Strategies” – Second Wave People

I finally got a chance today to start reading Michael Sampson’s book User Adoption Strategies – 2nd Ed.

I concentrated on Chapter 1. It was incredibly educational. In fact, I read it twice. In this chapter, amongst other things, Michael introduced the concepts of First Wave People and Second Wave People.

The best way of summing up the difference between these two types of people is by using a quote from Michael’s book:

A first wave person is attracted to the “what” of new technology, while second wave people focus on the “why”.

That one sentence captures it exactly. Michael also points out that these two types of people have different perceptions of reward. For First Wave People, getting to use new tools is reward enough, but Second Wave People have to understand where and how the new tools will improve their current work.

I’m looking forward to Chapter 2 tomorrow…


IT needs to be less “T” and more “B”

What follows is one of my posts that was recently published on AIIM’s site as an “Expert Blogger”. (The original can be read here.)

———————————————————————–

IT needs to be less “T” and more “B”

There is a “feeling” in the world of the Information Professional at the moment that there is too much focus on the “T” in IT. That is, the IT department focuses too much on “technology”, usually at the expense of what the customer – in most cases the business users – really needs.

Sure, the IT department is needed to install and maintain the technical infrastructure that allows a business to run, but it must not forget that it serves two masters. One is the executive layer who make the decisions about purchasing that infrastructure (and pay the salaries of those working in the IT department). The other is the user of the technology – because it is the users of the technology who actually add value to the business.

The business users are the ones who carry out the activities that let the business achieve what it needs to in order to exist. Anything that disrupts this process, or hinders it from being as efficient as it can be, is undesirable. This can range from the unnecessary installation of “new features” to applications that don’t really fit the activity that the business user carries out.

And this is why we have to start focusing less on the “T” (Technology) in IT. I’m not saying that the “T” is not important, but the “B” (“Business”) is also important. There needs to be more focus on the communication with the business. And not just “talking”, but actually more “listening”. And most importantly: “understanding”.

Understanding not only what the business is trying to do, but also how the business user carries out their tasks, is incredibly valuable. Understand the business processes, and then configure the technology in the way that best meets the business’s objectives while also taking into account the way the user performs their tasks.

This leads to a more productive environment where the users feel that they are “involved” with the solution put into place, rather than feeling that the IT department has imposed some cool, but not entirely useful, software solution on them.

We still need IT people who understand the “T”, but it’s the IT people who also understand the “B”, and can translate the “T” into something useful, who are the most valuable.

  • Why I Hate IT…
  • Information chaos threatening to derail business, according to AIIM

“The New Normal” – my initial thoughts

I have been given a copy of Peter Hinssen’s “The New Normal“.

This book is about the

“advancement in technology” that “is creating a new ‘normal’ where relationships with consumers are increasingly in a digital form.”

Hinssen claims that we are “half way”, and that amazing things are going to be happening.

I’ve only just started reading the book, but here are my thoughts so far (as reviewed on  Goodreads) …

======================================

The New Normal: explore the limits of the digital world by Peter Hinssen

28 February 2012

Just started reading this book…but so far I am unimpressed.

Hinssen is telling us nothing new. Yes, technology has made a big jump. Yes, there are young people today who have never had to use an “analog” anything. Yes, for them digital is normal.

And – another thing that irks me is the concept that we are “half way”. How do we know that we are half way? Half way to what? Saying that implies that there is a defined endpoint. And then what?

As mentioned – I’ve only started reading this book (up to page 14). The things that I mention above are enough to make me want to keep reading. I want to see if Hinssen moves away from this “wow – all this new technology” stance and offers something that isn’t self-evident. I also want to see whether he expands on this “half way” idea.

I will add to my comments once I have finished the book.

======================================

Here is a video that gives a “teaser” of his book…

Related Links

  • “The New Normal” (on Peter Hinssen’s site)
  • Synopsis (by Peter Hinssen)
  • “The New Normal” (on Amazon.com)
  • My review of the book (on Goodreads.com)
  • My profile on Goodreads.com

My Diigo bookmarks for the week

  • tags: linkedin guide

  • tags: sharepoint template

  • tags: funny european Europe maps stereotypes

  • tags: pharma drugs licence

  • tags: organization hofstede CULTURE powerdistance behaviour

  • tags: sharepoint

  • tags: ReverendFun.com cartoon

  • is the circle crumbling?

    tags: Google+ socialmedia people

Posted from Diigo. The rest of my favorite links are here.

Post-move SharePoint site Comparison


Recently I’ve been involved with a client project that included moving some SharePoint sites from one web application to another, as well as moving document libraries from a top site to a sub-site.

While I work at the business level (in a business systems analyst role), the move itself was done by the client’s IT Infrastructure people. Fortunately, they were smart enough to copy the content instead of moving it – a brilliant idea, as it meant the original content was still available.

Once the content had been moved, the next step was to check that no documents had been missed. Now, the site owner (at the business level) had the best idea of what content would be stored in the doclibs, but as there were 64 of them (some with a hundred documents, many with documents in the thousands), doing a direct comparison was not easy. There was also the fact that the new locations had been “unfrozen” and people were already uploading documents.

We investigated various ways to do a comparison. The first involved creating special views for the doclibs that would include only documents created before the “unfreeze” date, and then doing a screen-by-screen comparison. This was quickly deemed impractical, unwieldy, and bloody tiring.

Then we tried exporting the lists from the original location to a spreadsheet, doing the same with the new location, and placing the lists in columns next to each other for a side-by-side comparison. This was definitely more practical, and we thought that it was a plausible solution – until we discovered that for one of the doclibs there were 900 documents in the old location that were not in the new location.
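(If you wanted to take the manual work out of that spreadsheet step, the same comparison could be scripted. Below is a minimal sketch – not what we actually ran – assuming each doclib has been exported to CSV with “Name” and “Created” columns. The file names, column names, date format and unfreeze date are all hypothetical.)

```python
# Sketch: compare two exported doclib lists, ignoring documents uploaded
# after the "unfreeze" date. All file/column names below are hypothetical.
import csv
from datetime import datetime

UNFREEZE_DATE = datetime(2011, 10, 1)  # hypothetical cut-off date

def load_names(path):
    """Return the set of document names created before the unfreeze date."""
    names = set()
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            created = datetime.strptime(row["Created"], "%Y-%m-%d %H:%M")
            if created < UNFREEZE_DATE:
                names.add(row["Name"])
    return names

old_docs = load_names("old_doclib_export.csv")
new_docs = load_names("new_doclib_export.csv")

print("Missing from new location:", sorted(old_docs - new_docs))
print("Only in new location:", sorted(new_docs - old_docs))
```

Anything listed under “Missing from new location” would then need to be copied across again.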

Fortunately, we came across a tool from MetaVis whose application suite included a “Live Compare” feature. With this we were able to easily select one site in the left part of the screen, another site in the right, and then select the doclibs that we wanted to compare. After clicking on the “Go and check the differences” button (it was actually titled “Compare Now”), we could see which documents were in the old location but not in the new location, and vice versa. This was great! And compared to manually comparing lists, it was sooo much easier.


As well as differences in doclib content, we were also able to see small differences in other configuration settings. This was very handy.

Now – I know that the main functionality of the MetaVis tool is to do with migration and architecture, but this “Live Compare” functionality certainly saved us a lot of time and frustration.

The Use of Collaborative Software in Virtual Teams

I was delighted to discover a whitepaper by Eike Grotheer on “The Use of Collaborative Software in Virtual Teams”.

I’m interested in how “virtual teams” operate and work together, and so started reading his work. Then I realised that I had actually been part of his research. To gather data for his thesis, Eike had sent out requests to participate in a survey in May 2010. (Google still has a cached copy of the survey.) In November 2010, he sent out the results of his research. And I never looked at it! (Kicking myself now, though!)

As I read Eike’s work I got even more excited – his research not only involved communication in virtual teams, but he had also used TAM (the Technology Acceptance Model) to determine the effectiveness of the software.

(If you are not familiar with TAM (Technology Acceptance Model), please check out my earlier posts: Predicting User Acceptance; and Applying (loosely) the Technology Adoption Model to a Real-Life situation.)

Eike had used some pretty advanced statistical techniques to analyze his findings (Spearman’s rank correlation coefficient; Kruskal–Wallis one-way analysis of variance), and I won’t go into those in detail.

Survey Results Summarized
  • 265 people responded to the survey.
  • There was also a very large variety of tools in use (Microsoft Outlook, SharePoint, Microsoft Project Server, Lotus Notes, Lotus Sametime, Lotus Quickr, and Google Apps were all listed, along with other collaborative applications).
  • Most of the features that are frequently used can be split into two categories:
      • Tools for sharing and managing information (e.g.  document, content and knowledge management)
      • Tools for direct communication between team members

User Satisfaction and the Use of Collaborative Software in Virtual Teams

OK – this is where it started getting interesting. Eike rightly states that

the use of information systems can only provide a benefit to an organization if users first of all have interest in using them and then actually make use of them.

To try and explain this, the Technology Acceptance Model was devised (see the earlier-mentioned posts for more detail). It states that a user’s intention to use a system is influenced by the perceived usefulness and the perceived ease-of-use.

Eike analyzed these two determinants (perceived usefulness and perceived ease-of-use) to determine their impact on the use of collaborative software. (He points out that, as everyone who responded to the survey is already using collaborative software, the intention is already known, and that the use is measured.) 

Again, I won’t go into too much detail. In the survey there were 4 statements that were related to the perceived usefulness, and 4 statements that were related to perceived ease-of-use.

Performing a bivariate correlation analysis on the data from the survey, Eike was able to show that there was a positive correlation between the perceived usefulness and the actual use. This effectively proves (statistically) that the more users perceive collaborative software to be useful within a virtual team, the more they will use it. (It sounds logical, but it means that this part of the TAM can be verified.)
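(For anyone curious what such a bivariate analysis looks like in practice, here is a tiny illustrative sketch in Python using SciPy’s Spearman rank correlation – the scores are made up for illustration and are not from Eike’s survey.)

```python
# Illustrative only: correlate perceived-usefulness scores with extent of use.
from scipy.stats import spearmanr

perceived_usefulness = [4, 5, 3, 4, 2, 5, 3, 4, 1, 5]  # made-up Likert-style responses
extent_of_use        = [3, 5, 3, 4, 1, 4, 2, 5, 2, 5]  # made-up usage ratings

rho, p_value = spearmanr(perceived_usefulness, extent_of_use)
print(f"Spearman's rho = {rho:.2f}, p = {p_value:.3f}")
```

A rho close to +1 with a small p-value is the kind of result that supports the claim that perceived usefulness and actual use go hand in hand.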

Tackling the other determinant of the TAM, Eike did a bivariate correlation analysis between each perceived ease of use item, and the extent of use of collaborative software.

There was no significant correlation, which suggested that the ease of use of collaborative software has only a minor effect on usage behaviour. However, it wasn’t actually possible to draw a firm conclusion, as the survey participants were all experienced IT users, and any difficulty with the software may not have prevented them from using it.

Going further, Eike investigated the impact of TAM factors on project success. Again using statistics he was able to show that there was a positive correlation between perceived usefulness and project success, and between perceived ease-of-use and project success. This confirmed that a relationship between the use of collaborative software and project success does exist.

In other words, both how useful the participants perceived the collaboration software used in the virtual team to be, and how easy they thought it was to use, had a positive impact on the success of the project in all aspects.

Summing it up

Sometimes it is easy to think “well, that’s already obvious”, but I always find it valuable to be able to scientifically prove (in one way or another) what everyone assumes.

And that is why I found Eike’s research exciting. From a handful of well thought-out survey questions, he was able to scientifically prove that

if software is considered useful by its users, it enables them to become effective and productive in their work, and if it is easy to use, it enables them to make use of it straight away, and leads quickly to desired results. 

Other useful links:

  • Virtual Teams: Key Success Factors – Part 1
  • Virtual Teams: Key Success Factors – Part 2
  • Virtual Teams: Key Success Factors – Part 3
  • The Complexity of Virtual Teams


SPX Series – SharePoint eXperience – (aka SPX) – Series Introduction

This is part of the SPX Series

Hands up those of you who know what SPX is an acronym for.  (Hint – the answer is in the title of this post.)

SPX is the technology from CSC that allows users, from a SharePoint interface, to interact with documents in a FirstDoc-Documentum system. (And, if you didn’t know – FirstDoc is CSC’s Life Sciences compliance layer that sits on top of Documentum.) The technology consists of specially constructed web parts and a back-end Docway web server that acts as a “translator” for communication between the web parts and the Documentum server.

In fact, if you look at Andrew Chapman’s list of Reference Models, the SPX web parts would be the 3rd model listed.

Now – I have been working with SharePoint eXperience (SPX) technology for a while, ever since the first version. I’ve been involved on a technical level as a customer. (That is, someone who has actually had to use the technology in a real business environment to meet real-world requirements.)

As such, I thought it might be a good idea to start a series of posts on what the technology can do, along with some best practices. Here is a list of the things I will cover:

  • Overview – what SPX is, etc.
  • Best Practices – what are some of the best ways to configure/use SPX
  • Some of the issues that I have had to deal with
  • Anything else that I can think of.

Feedback from readers is always great to receive, so if you have a question or a suggestion, I’ll certainly do my best to answer it.

Next post: SPX Series – A little bit of history

FirstDoc User Group 2011 – a look back at the conference – Part 2

Previous Post: FirstDoc User Group 2011 – a look back at the conference – Part 1

In Part 1 of the FDUG 2011 series, I described the location of the meeting, and gave an overview of CSC’s plan and strategies. In Part 2, I’ll talk about the rest of the conference.

Going thin

After the break, two of the Pharma companies each gave a presentation on a project they were involved with to upgrade their document management systems.

I’m not at liberty to discuss the details, but it is obvious that the drivers in the pharmaceutical world are the same as in any other business. Namely,

  • Try and get as much functionality out of a product without writing customized code.
  • Aim to increase the usability of a solution
  • Make use of “thin” technology – (Portals).

The business cases presented described how CSC technology was being used to allow these goals to be met.  Always interesting to see, as this is a common theme.

Partnership Program

In this session CSC described their “Partner Program” plans.

CSC’s goal here is to “put more effort into Partnerships to increase their usefulness.” That is, with a good network of “CSC Partners”, CSC can meet client requirements, offer more, and be more responsive (i.e. have more resources available).

Companies that partner with CSC will fall into one of three areas: Technology; Sales; Solution. Each area has its own “model” and KPIs that need to be met in order to retain Partner status, with “Customer Satisfaction” being the most important.

The message was that CSC want to seriously lift their game here. This will include certification, KPIs, and working with the Partners to deliver a “unified” message.

Certification

As mentioned, CSC will be offering a certification program.

This will be made up of four tiered capabilities (Installation, Configuration, Customization, and Architecture). CSC are looking at some type of “boot camp” experience where individuals attend a week-long course for each capability. This will be followed by several weeks of “shadowing” on client projects.

The fact that CSC mention this signals that they want to set a standard that the people who partner with them will meet – which is encouraging. The “certification” is for the individual (that is, it’s not transferable to other people at the partner site).

Curious to see how this one will pan out.

Total Regulatory Solution

In the keynote presentation, there was mention of CSC’s “Total Regulatory Solution”.

Jennifer Wemstrom (who flew over to this year’s European FDUG) presented CSC’s overview of their “Total Regulatory Solution”.

Underpinning this is CSC’s aim to provide the “Total Business Solution” that supports the creation, management and consumption of regulatory documentation in the Life Sciences industry.

In simple words, CSC have all the tools (especially since their acquisition of ISI and their publishing tools) to achieve this, but the tools are still disparate applications. CSC’s goal is that all these disparate systems will be unified. They will have a common interface and use a shared data model.

This is definitely the right move. In my years as an ECM specialist I have seen companies grow through the acquisition of other companies that offer a solution that complements, or even enhances, the parent company’s offerings. The next logical step is to integrate the applications that make up the suite so that the user is presented with a seamless “solution”.

At the same time CSC seem to be actively investigating offering more than just a suite of technical products. They have realised that they have a lot of skill and knowledge in this area, and are talking about Business Process Outsourcing, and offering their Total Regulatory Solution as a managed service. (This ties in with CSC’s goal to dive into the cloud.)

New Product Offerings

CSC realise that there are still a few “gaps” in their offering. They are busy with three new products, all to do with the submission end of the process. It looks like CSC are really listening to their customers.

Business Process Outsourcing

In this area CSC have three offerings:

  • Staff Augmentation – where CSC staff will work “side by side” with the customer;
  • Tactical Outsourcing – where CSC will handle specific aspects of the regulatory process.
  • Functional Outsourcing of regulatory activity.

As mentioned above, CSC definitely want to make good use of the skills & experience they have built up, and want to expand into offering services rather than just technology.

To back this up, CSC described how they will be tackling staff training (resource development). They have three levels, which include a sort of “orientation/induction” level, “core training” for regulatory activities, and then “client-specific training”, which addresses the activities that a client has outsourced to CSC.

Managed Services

CSC have a series of Managed Service Models. These include the traditional models of “on premise” or “hosted” through to “As a service” which includes “Dedicated”, “Private Cloud”, and “Public Cloud”.  A flavor to suit all requirements.

FirstDoc 6.3

Bill Meier spent some time discussing CSC’s latest version of FirstDoc (version 6.3), which includes a large number of enhancements.

Among the high points: this version will be certified on Linux.

…continued in Part 3

Next Post: FirstDoc User Group 2011 – a look back at the conference – Part 3