Tuesday, 9 December 2008

Measuring Usability: A Task-Based Approach

I think we all know that the simplest practical measure of intelligence is how often someone agrees with you. On that scale, University of Ottawa Professor Timothy Lethbridge must be some kind of genius, because his course notes on Software Usability express my opinions on the topic even better and in more detail than I have yet done myself. Specifically, he lists the following basic process for measuring usability (a small code sketch follows the list):

- understand your users, and recognize that they fall into different classes
- understand the tasks that users will perform with the system
- pick a representative set of tasks
- pick a representative set of users
- define the questions you want to answer about usability
- pick the metrics that answer those questions
- have the users perform the tasks and measure their performance
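To make those steps concrete, here is a minimal sketch in Python of how such a test plan might be represented. The user class, task, questions, and metrics are hypothetical examples of my own for a demand generation system, not taken from Lethbridge’s notes:

```python
from dataclasses import dataclass

@dataclass
class UserClass:
    """A class of users, described along the dimensions Lethbridge suggests."""
    name: str
    job_type: str
    task_experience: str       # e.g. "novice" or "expert" with these tasks
    computer_experience: str   # general computer experience

@dataclass
class Task:
    """A representative task drawn from a use case / business process."""
    name: str
    times_per_week: float      # how often the task is performed
    importance: int            # e.g. 1 (low) to 5 (high)

@dataclass
class TestPlan:
    """Representative users and tasks, plus the questions and metrics chosen for them."""
    users: list
    tasks: list
    questions: list
    metrics: list

# Hypothetical plan fragment for a demand generation system
plan = TestPlan(
    users=[UserClass("campaign manager", "marketing", "expert", "high")],
    tasks=[Task("build an email campaign", times_per_week=3, importance=5)],
    questions=["How long does a trained user need to build a standard campaign?"],
    metrics=["time on task", "error rate", "satisfaction (1-5)"],
)
```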

That process is very much the approach I’ve been writing about, in pretty much the same words. Happily, Lethbridge provides additional refinement of the concepts. Just paging through his notes, some of his suggestions include:

- classifying users along several dimensions, including job type, experience with the tasks, general computer experience, personality type, and general abilities (e.g., language skills and physical disabilities). I’d be more specific and add skills such as analytical or technical knowledge.

- defining tasks based on use cases (I tend to call these business processes, but it’s pretty much the same); understanding how often each task is performed, how much time it takes, and how important it is; and testing different tasks for different types of users. “THIS STEP CAN BE A LOT OF WORK” the notes warn us, and, indeed, building the proper task list is probably the hardest step in the whole process.

- a list of metrics (a rough computation sketch follows this list):

- proficiency, defined as the time to complete the chosen tasks. That strikes me as an odd label, since I usually think of proficiency as an attribute of a user, not a system. The obvious alternative is efficiency, but as we’ll see in a moment, he uses that for something else. Maybe “productivity” would be better; I think this comes close to the standard definition of labor productivity as output per hour.

- learnability, defined as time to reach a specified level of proficiency.

- efficiency, defined as proficiency of an expert. There’s no corresponding term for “proficiency of a novice”, which I think there should be. So maybe what you really need is “expert efficiency” and “novice efficiency”, or expert and novice “productivity”, and discard “proficiency” altogether.

- memorability, defined as proficiency after a period of non-use. If you discard proficiency, this could be “efficiency (or productivity) after a period of non-use”, which makes just as much sense.

- error handling, defined as number or time spent on deviations from the ideal way to perform a task. I’m not so sure about this one. After all, time spent on deviations is part of total time spent, which is already captured in proficiency or efficiency or whatever you call it. I’d rather see a measure of error rate, which would be defined as number or percentage of tasks performed correctly (by users with a certain level of training). Now that I think about it, none of Lethbridge’s measures incorporate any notion of output quality—a rather curious and important omission.

- satisfaction, defined subjectively by users on a scale of 1 to 5.

- plot a “learning curve” on the two dimensions of proficiency and training/practice time; the shape of the curve provides useful insights into novice productivity (what new users can do without any training), learnability (a steep early curve means people learn the system quickly), and eventual efficiency (the level of proficiency where the curve flattens out).

- even expert users may not make the best use of the system if they stop learning before they master all its features. So the system should lead them to explore new features by offering tips or making contextual suggestions.
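As a rough illustration of how the timing-based metrics above might be computed, here is a small Python sketch. The session times, the proficiency target, and the “average of the last three sessions” rule for expert efficiency are all invented for the example rather than taken from the notes:

```python
import statistics

# Hypothetical times (minutes) one user needed to complete the chosen tasks,
# recorded across successive practice sessions.
session_times = [42.0, 30.0, 24.0, 20.0, 18.5, 18.0, 17.8]

def proficiency(times):
    """Lethbridge's 'proficiency': time to complete the chosen tasks (latest session)."""
    return times[-1]

def learnability(times, target):
    """Practice sessions needed to reach a specified level of proficiency."""
    for session, t in enumerate(times, start=1):
        if t <= target:
            return session
    return None  # target never reached

def efficiency(times):
    """Proficiency of an expert: here, the average time once the curve has flattened."""
    return statistics.mean(times[-3:])

def error_rate(tasks_attempted, tasks_correct):
    """The extra measure argued for above: share of tasks not performed correctly."""
    return 1 - tasks_correct / tasks_attempted

print(proficiency(session_times))                                  # 17.8
print(learnability(session_times, target=20))                      # 4 sessions
print(round(efficiency(session_times), 1))                         # 18.1
print(round(error_rate(tasks_attempted=20, tasks_correct=18), 2))  # 0.1
```

Plotting session_times against session number gives exactly the learning curve described above, and memorability could reuse proficiency() on a session recorded after a period of non-use.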

At this point, we’re about halfway through the notes. The second half provides specific suggestions on:

- measuring learnability (e.g. by looking at features that make systems easy to learn);

- causes of efficiency problems (e.g. slow response time, lack of an easy step-by-step route to perform a task);

- choosing experts and what to do when experts are unavailable (basically, plot the learning curve of new users);

- measuring memorability (which may involve different retention periods for different types of tasks, and should also distinguish between frequently and infrequently used tasks, with special attention to handling emergencies)

- classifying errors (based on whether they were caused by user accidents or confusion [Lethbridge says that accidents are not the system’s fault while confusion is; this is not a distinction I find convincing]; also based on whether the user discovers them immediately, discovers them after some delay, has the system point them out, or never learns of them at all)

- measuring satisfaction (surveys should be based on real and varied work rather than just a few small tasks, should be limited to 10-15 questions, should use a “Likert Scale” of strongly agree to strongly disagree, and should vary the sequence and wording of questions)

- measuring different classes of users (consider their experience with computers, with the application domain, and with the system being tested; the best way to measure proficiency differences is to compare the bottom 25% of users with the third-best 25%, since this eliminates outliers)
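For the quartile comparison in that last point, here is one possible reading as a small sketch; the completion times are invented and the interpretation (ranking users fastest to slowest) is my own:

```python
# Hypothetical task-completion times (minutes), one per test user, ranked fastest to slowest.
times = sorted([12, 14, 15, 16, 18, 19, 21, 22, 25, 28, 33, 40])

q = len(times) // 4
third_best_25 = times[2 * q: 3 * q]   # users ranked in the third-best quartile
bottom_25 = times[3 * q:]             # the slowest 25% of users

def mean(xs):
    return sum(xs) / len(xs)

print(f"bottom 25% take {mean(bottom_25) / mean(third_best_25):.1f}x "
      f"as long as the third-best 25%")
```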

This is all good stuff. Of course, my own interest is applying it to measuring usability for demand generation systems. My main take-aways for that are:

1. defining the user types and tasks to measure is really important. But I knew that already.

2. choosing the actual metrics takes more thought than I’ve previously given it. Time to complete the chosen tasks (I think I’ll settle on calling it productivity) is clearly the most important. But learnability (which I think comes down to time to reach a specified level of expertise) and error rate matter too.

For marketing automation systems in particular, I think it’s reasonable to assume that all users will be trained in the tasks they perform. (This isn’t the case for other systems, e.g. ATMs and most consumer Web sites, which are used by wholly untrained users.) The key to this assumption is that different tasks will be the responsibility of different users; otherwise, I’d be assuming that all users are trained in everything. So it does require determining which users will do which tasks in different systems.

On the other hand, assuming that all tasks are performed by experts in those tasks does mean that someone who is expert in all tasks (e.g., a vendor sales engineer) can actually provide a good measure of system productivity. I know this is a very convenient conclusion for me to reach, but I swear I didn’t start out aiming for it. Still, I do think it’s sound, and it may provide a huge shortcut in developing usability comparisons for the Raab Guide. What it does do is require a separate focus on learnability so we don’t lose sight of that one. I’m not sure what to do about error rate, but I do know it has to be measured for experts, not novices. Perhaps when we set up the test tasks, we can include specific content that can later be checked for errors. Interesting project, this is.

3. the role of surveys is limited. This is another convenient conclusion, since statistically meaningful surveys would require finding a large number of demand generation system users and gathering detailed information about their levels of expertise. It would still be interesting to do some preliminary surveys of marketers to help understand the tasks they find important and, to the degree possible, to understand the system features they like or dislike. But the classic usability surveys that ask users how they feel about their systems are probably not necessary or even very helpful in this situation.

This matters because much of the literature I’ve seen treats surveys as the primary tool in usability measurement, which is why I’m relieved to find an alternative.

As an aside: many usability surveys, such as SUMI (Software Usability Measurement Inventory), are proprietary. My research did turn up what looks like a good public version: “Measuring Usability with the USE Questionnaire” by Arnold M. Lund, from the Society for Technical Communication (STC) Usability SIG Newsletter of October 2001. The acronym USE stands for the three main categories: Usefulness, Satisfaction, and Ease of Use/Ease of Learning. The article provides a good explanation of the logic behind the survey and is well worth reading if you’re interested in the topic. The questions, which would be asked on a 7-point Likert Scale, are listed below (a small scoring sketch follows the list):

Usefulness
- It helps me be more effective.
- It helps me be more productive.
- It is useful.
- It gives me more control over the activities in my life.
- It makes the things I want to accomplish easier to get done.
- It saves me time when I use it.
- It meets my needs.
- It does everything I would expect it to do.

Ease of Use
- It is easy to use.
- It is simple to use.
- It is user friendly.
- It requires the fewest steps possible to accomplish what I want to do with it.
- It is flexible.
- Using it is effortless.
- I can use it without written instructions.
- I don't notice any inconsistencies as I use it.
- Both occasional and regular users would like it.
- I can recover from mistakes quickly and easily.
- I can use it successfully every time.

Ease of Learning
- I learned to use it quickly.
- I easily remember how to use it.
- It is easy to learn to use it.
- I quickly became skillful with it.

Satisfaction
- I am satisfied with it.
- I would recommend it to a friend.
- It is fun to use.
- It works the way I want it to work.
- It is wonderful.
- I feel I need to have it.
- It is pleasant to use.

Apart from the difficulties of recruiting and analyzing a large enough number of respondents, this type of survey only gives a general view of the product in question. In the case of demand generation, this wouldn’t allow us to understand the specific strengths and weaknesses of different products, which is a key objective of any comparative research. Any results from this sort of survey would be interesting in their own right, but couldn’t themselves provide a substitute for the more detailed task-based research.
