How to Evaluate Website Platforms

Five steps to accelerate your software evaluation.

How you evaluate website platforms can vary widely depending on your organization’s culture and the personalities of your project management team. By laying out a standardized evaluation approach, you can streamline the purchasing process, review platforms on equal footing and create a record of scores that can be revisited for future projects. Here are five steps for evaluating website platforms that can easily be applied to any other software solutions you plan to compare.

Tip: Accelerate your evaluation process by downloading our editable spreadsheet of evaluation criteria. It includes the criteria categories and calculations detailed below. Download Now

#1 - Define Evaluation Criteria

Defining evaluation criteria starts with gathering requirements at the beginning of a project. Take your user requests and split them into clear categories, with a list of features within each one. These features should be easy to define and measure. For instance, “good UI” is not a useful requirement, but “System simplifies the development of web pages leveraging themes, access rules and web components” is better.

There are five recommended areas of website platform criteria:

  • Cost: Established stack players such as IBM, Oracle and Microsoft have expensive platforms and complex maintenance and support pricing structures, but often come with a wide network of partners. Open source platforms can offer more affordable options with a much wider network of specialists, thanks to the use of open standards and development methodologies.
  • Risk: Determine the acceptable risk level for long-term costs, integration capabilities, product chaining risk and end user satisfaction.
  • Control: Companies often need to strike a balance between centralized and distributed control to streamline decision making and maintain a level of team responsiveness.
  • End User Capabilities: No one likes a solution that’s difficult to use. Prioritizing solutions that are intuitive encourages user adoption. This is especially true for solutions reaching a diverse audience of users, such as portals or collaboration tools.
  • Heterogeneity: Companies can reduce maintenance and support costs by simplifying how organizational needs for product and content management are reviewed and enforced at a high level, especially where central and departmental needs differ.

These areas can be organized by function instead, such as Portal or Social Collaboration. Just be careful not to overlook areas that aren’t tied to a specific feature, such as the risk of implementing a system that will force you to use more products from the same company down the line.
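To make this concrete, here is a minimal sketch of how criteria might be grouped into the five areas above. The feature statements are illustrative examples, not an official checklist; the point is that each one is specific enough to be scored.

```python
# Illustrative evaluation criteria grouped by the five recommended areas.
# Each statement is specific enough to rate on a numerical scale,
# unlike a vague requirement such as "good UI".
evaluation_criteria = {
    "Cost": [
        "Maintenance and support pricing is published and predictable",
    ],
    "Risk": [
        "Adopting the platform does not force the purchase of additional products later",
    ],
    "Control": [
        "Central IT can enforce governance while departments manage their own pages",
    ],
    "End User Capabilities": [
        "System simplifies the development of web pages leveraging themes, "
        "access rules and web components",
    ],
    "Heterogeneity": [
        "Central and departmental content management needs can be reviewed "
        "and enforced in one place",
    ],
}
```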

#2 - Quantify It

Quantifying the evaluation process makes the comparison easier and more accurate. Each vendor should be evaluated separately with the same table of criteria, then rated on a numerical scale. Often, stakeholders can become tied to a particular feature in one platform, and will argue for it without considering the overall strength of that solution. Quantifying your criteria ensures that you are giving weight to all criteria.

You can build your table of criteria based on the requirements from your team, or look for a sample of criteria online, such as our Liferay Buyer’s Guide Checklist, a platform-agnostic guide that identifies critical categories for website platforms and breaks out key features for each one.

Whether you build your own or use a template, the important thing is having the ability to assign those numerical values. Once you have that, you have deep insight into the strengths and weaknesses of each platform. You can compare vendors by category, by the total composite score, or you can weight the criteria that are more important for your project. We’ll discuss this more below.
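As a rough illustration, the sketch below shows the core idea, assuming a 1 to 5 rating scale and made-up vendor names and ratings: every vendor is rated against the same list of criteria, so the composite scores stay comparable.

```python
# A minimal sketch of a shared scoring table, assuming a 1-5 rating scale.
# Vendors, criteria and ratings are invented for illustration.
criteria = [
    "Non-technical users can publish pages without developer help",
    "Platform supports single sign-on with existing identity providers",
    "Maintenance and support pricing is published and predictable",
]

ratings = {
    "Vendor A": {criteria[0]: 4, criteria[1]: 5, criteria[2]: 2},
    "Vendor B": {criteria[0]: 3, criteria[1]: 4, criteria[2]: 4},
}

for vendor, scores in ratings.items():
    # Every vendor must be rated on every criterion to keep the comparison fair.
    missing = [c for c in criteria if c not in scores]
    assert not missing, f"{vendor} was not rated on: {missing}"
    print(vendor, "composite score:", sum(scores.values()))
# Vendor A composite score: 11
# Vendor B composite score: 11
```

A tie like this on the composite score is exactly where the category and weighted comparisons in step 4 become useful.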

#3 - Determine Use Cases

During the initial research process, vendors’ sales and marketing teams will likely present you with every possible scenario of how their platforms can be used. There are two ways to filter out the noise and focus on what’s relevant to your evaluation:

  • Choose one use case: By narrowing your scope to a single use case, you can limit the project requirements that you need to evaluate, and possibly eliminate some standard criteria altogether. This is especially useful if you’re choosing a platform for a standalone project that won’t need to integrate with other environments.
  • Set a timeline: Determine your project timeline before reviewing solutions, then prioritize the features that you’ll need during the initial implementation stages. This ensures that you’re taking into consideration functionality that you’ll need in the future, but not sacrificing the capabilities you know you need first.

Also See: Seven Popular Liferay Use Cases
Need help narrowing down which use case to focus on? Download our e-book to learn more about seven popular use cases, including ecommerce sites, self-service customer portals, public websites, and more.
Download the E-Book

#4 - Compare at Different Levels

There are three different “levels” you can (and should) use during your evaluation.

  • Overall Score: After all the vendors have been reviewed, look at the overall score for each. In our Buyer’s Guide sample, this is the total value of all the rightmost numbers added up, for a possible total of 1,465 points. Additionally, look at the distribution of scores throughout the table. It’s possible for a vendor to have a high overall score, but still have more 1’s than another vendor. Looking for an even distribution of high scores helps ensure consistency and accounts for the overall balance of the platform.
  • Category Score: Each platform can also be broken into categories such as Portal, Collaboration, Web Content Management and so forth. By comparing scores in the categories that are most important to your project, you can gain further insight when making a value judgment. Keep in mind that a vendor with the highest overall score can still score lower in the most important category for your project.
  • Weighted Score: Some criteria will be more important to your project than others. While one vendor may score well on three questions but poorly on one, a second vendor may do the opposite. If a vendor scored poorly on one of the most important features, then the final score may misrepresent the vendor. To adjust for these priorities, add a weight to each row after the table is completed. Project teams can assign each row a weight from 1 to 5, then multiply the rating by the weight to calculate the value placed in the rightmost column. For example, if a vendor rates “4” on a question given a weight of 5, the value placed in the rightmost column would be 20. A brief sketch of these calculations follows this list.
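Here is a minimal sketch of the three comparison levels, again with invented categories, criteria, weights and ratings. Note how a vendor can trail on the overall score yet lead once weights reflect project priorities.

```python
# Illustrative overall, category and weighted scores (ratings and weights on a 1-5 scale).
from collections import defaultdict

# (category, criterion, weight)
criteria = [
    ("Portal",                 "Role-based personalization",     5),
    ("Collaboration",          "Document sharing and workflows", 3),
    ("Web Content Management", "Non-technical page publishing",  4),
    ("Web Content Management", "Multi-site content reuse",       2),
]

# One rating per criterion, in the same order as the list above.
ratings = {
    "Vendor A": [4, 5, 3, 4],
    "Vendor B": [5, 2, 5, 3],
}

for vendor, scores in ratings.items():
    overall = sum(scores)
    weighted = sum(rating * weight for rating, (_, _, weight) in zip(scores, criteria))
    by_category = defaultdict(int)
    for rating, (category, _, _) in zip(scores, criteria):
        by_category[category] += rating
    print(vendor, "overall:", overall, "weighted:", weighted, "categories:", dict(by_category))
# Vendor A overall: 16 weighted: 55 categories: {'Portal': 4, 'Collaboration': 5, 'Web Content Management': 7}
# Vendor B overall: 15 weighted: 57 categories: {'Portal': 5, 'Collaboration': 2, 'Web Content Management': 8}
```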

#5 - Don’t Rule Out Qualitative Evaluation

After your team has gone through the quantitative scores, you should counterbalance with a qualitative review. This should include review and discussion of the final scores, as well as a mix of the following:

  • Q&A demonstration
  • Proof of concept
  • Software trial
  • Reference customer outreach
  • Online evaluations

For these conversations, it’s important to bring in stakeholders from multiple departments, in order to account for concerns from all end users.

A Better Decision-Making Tool

An evaluation process isn’t meant to spit out one clear number to help you choose a platform. It’s a tool to guide your team in thinking through all the capabilities of a platform and being intentional in how you weigh the strengths and weaknesses of each one. By identifying not just your requirements but how those requirements impact one another, and by clarifying your priorities, your company is much more likely to choose a website platform that fits both your short-term and long-term needs.

How to Design a Website That Drives Conversions

Learn five steps for developing a proactive, user-centric design strategy for your website.

Read the E-Book  
Originally published August 15, 2017; last updated October 6, 2022.