How to Interpret Partworth Utilities


Partworth utilities (also known as attribute importance scores and level values, or simply as conjoint
analysis utilities) are numerical scores that measure how much each feature influences the customer’s
decision to select an alternative. Because partworths of attributes and levels in conjoint analysis are
interrelated, in this post we will look at them using the same example of a mobile plan.
Suppose a company wants to find out customers’ preferences for mobile plans to reassess its product
range as a pathway to growth. They are investigating the following four attributes to see which
combination of levels within the attributes creates an optimal plan.
Price: $70 per month, $50 per month, $30 per month
Data included: 500MB, 1GB, 10GB, unlimited
International minutes included: 0 min, 90 min, 300 min
SMS included: 300 messages, Unlimited text

Relative importance by attribute (Attribute partworths)

The relative importance of each attribute shows its importance relative to other attributes. Values in
this chart sum up to 100%. At 43.8%, the “Data included” attribute turns out to be the most important
attribute with “International minutes included” being the least important attribute. It appears price is not
as important a factor as “Data included”.
Attribute partworths are calculated as the average of each respondent’s attribute partworth utilities.
Each respondent’s attribute partworth is calculated as the range of preferences for levels within that
attribute.
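
As a rough sketch of this calculation (in Python, with made-up partworths purely for illustration), each respondent’s importances are the ranges of their level partworths within each attribute, rescaled to sum to 100%, and the reported numbers are the averages across respondents:

```python
import numpy as np

# Hypothetical level partworths for two respondents (illustrative values only;
# in practice these come from the estimated conjoint model).
respondent_partworths = [
    {"Price": [-0.9, 0.1, 0.8],
     "Data included": [-1.2, -0.6, 0.3, 1.5],
     "International minutes included": [-0.2, 0.0, 0.2],
     "SMS included": [-0.3, 0.3]},
    {"Price": [-0.5, 0.0, 0.5],
     "Data included": [-1.0, -0.4, 0.2, 1.2],
     "International minutes included": [-0.1, 0.0, 0.1],
     "SMS included": [-0.2, 0.2]},
]

def attribute_importance(partworths):
    """Per-respondent importance: range of level partworths within each attribute,
    rescaled so that the importances sum to 100%."""
    ranges = {attr: max(levels) - min(levels) for attr, levels in partworths.items()}
    total = sum(ranges.values())
    return {attr: 100 * r / total for attr, r in ranges.items()}

# Average the per-respondent importances to get the reported attribute partworths.
per_respondent = [attribute_importance(p) for p in respondent_partworths]
average_importance = {
    attr: np.mean([imp[attr] for imp in per_respondent])
    for attr in per_respondent[0]
}
print(average_importance)
```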

This chart shows how strongly the variations of attributes affect customers’ choice, but only for the
levels that you chose in the design. If a more extreme level were added to one of the attributes, that
attribute would likely become higher in importance. For example, if we add a more extreme price level
($150 per month), customers are likely to shun it and therefore the partworth of that level will be very
negative, which will in turn inflate the importance of the whole price attribute.
Relative value by level (Level partworths)


Level partworths allow you to dive deeper to understand which specific levels within an attribute drive
customers’ choice. In this example, the unlimited data plan is strongly preferred to the 500MB and 1GB
plans, and somewhat preferred to the 10GB plan.

Level partworths are calculated based on the average preference scores for each level. Levels that are
strongly preferred by customers are assigned higher scores, while levels that perform poorly (in
comparison) are assigned lower scores. The chart is scaled so that, for each attribute, the sum of all
positive values equals (the absolute value of) the sum of all negative values.
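
That scaling amounts to zero-centring the level scores within each attribute. A minimal sketch, using hypothetical average scores for the “Data included” levels:

```python
import numpy as np

# Hypothetical average preference scores for the "Data included" levels
# (illustrative values only).
raw_scores = {"500MB": 1.0, "1GB": 1.3, "10GB": 2.4, "Unlimited": 4.1}

# Zero-centre within the attribute: after subtracting the attribute mean,
# the positive partworths balance the negative ones exactly.
mean_score = np.mean(list(raw_scores.values()))
level_partworths = {level: score - mean_score for level, score in raw_scores.items()}

print(level_partworths)
# The sum of positive values equals the absolute value of the sum of negative values.
```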

Again, it is important to remember that these partworths are relative. If we add a new level for the
attribute “Data included”, the relative value of each level will change.

Distribution of preferences for levels

This chart shows the distribution of preferences for various levels. It answers the question: Assuming
that each consumer has a preference for different levels, what is the distribution of preferences for
different levels (within each attribute) across consumers?

This information allows you to dive deeper into the preference distribution for various levels within an
attribute. In this example regarding data plans, 83.2% of total preference goes to unlimited data, 10.3%
to 10GB, 3.6% to 1GB, and 2.9% to 500MB. The unlimited data plan is strongly preferred to the 500MB
and 1GB plans, and somewhat preferred to the 10GB plan.

The distribution of preferences for levels is based on the ratio of preference scores for levels within the
attribute for each respondent. Levels with a high percentage of preference are more preferred within
the attribute across all respondents.
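
The sketch below illustrates one way such a ratio could be computed, assuming hypothetical non-negative preference scores for each respondent (the actual scores come from the estimated model):

```python
import numpy as np

levels = ["500MB", "1GB", "10GB", "Unlimited"]

# Hypothetical non-negative preference scores for "Data included";
# each row is one respondent, columns follow `levels` (illustrative values only).
scores = np.array([
    [0.1, 0.2, 0.5, 4.0],
    [0.2, 0.1, 0.8, 5.5],
    [0.0, 0.3, 0.4, 3.8],
])

# Per respondent: each level's share of the total score within the attribute.
shares = scores / scores.sum(axis=1, keepdims=True)

# Average the shares across respondents to get the reported distribution.
distribution = dict(zip(levels, 100 * shares.mean(axis=0)))
print(distribution)  # percentages that sum to 100% within the attribute
```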

Distribution of most preferred levels

This chart shows the distribution of levels most preferred by consumers. It answers the question:
Assuming that each consumer likes different levels, how many consumers have each level as their
most preferred?

Following the same example of mobile data plans, 97.9% of respondents most preferred unlimited
data, with 2.1% of respondents most preferring 1GB. Therefore, unlimited data is easily the most
preferred level within this attribute.
The distribution of most preferred levels is calculated as the percentage of respondents choosing a
specific level as their top preferred option within an attribute. A high percentage means that the level
is the favourite of a large share of consumers.
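
As an illustration (again with made-up partworths), the calculation simply counts, for each level, the respondents whose highest partworth within the attribute falls on that level:

```python
import numpy as np

levels = ["500MB", "1GB", "10GB", "Unlimited"]

# Hypothetical per-respondent level partworths for "Data included" (illustrative values only).
partworths = np.array([
    [-1.2, -0.6, 0.3, 1.5],
    [-0.8,  0.9, 0.2, 0.5],
    [-1.0, -0.4, 0.6, 1.8],
])

# For each respondent, find the level with the highest partworth...
top_level = partworths.argmax(axis=1)

# ...then report the share of respondents for whom each level is the favourite.
counts = np.bincount(top_level, minlength=len(levels))
most_preferred = dict(zip(levels, 100 * counts / len(partworths)))
print(most_preferred)  # e.g. {'500MB': 0.0, '1GB': 33.3, '10GB': 0.0, 'Unlimited': 66.7}
```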

Ranked list of product concepts


For each report, Conjointly will present a “ranked list of product concepts as preferred by customers”.
This is a list of some (but very often not all, because they can number in the millions) potential
combinations of features and prices that represent product concepts. The column “Value to
customers” contains a single number for each concept combination (row) with specific feature and
price levels. This number is calculated as the average across respondents of each respondent’s total
partworth utility score for the combination. It is scaled with 0 as the average value.
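
The sketch below illustrates the idea with made-up partworths: a concept’s value for one respondent is the sum of the partworths of its levels, and the reported “Value to customers” is that total averaged across respondents:

```python
import itertools
import numpy as np

# Hypothetical per-respondent level partworths (illustrative values only).
respondents = [
    {"Price": {"$70": -0.9, "$50": 0.1, "$30": 0.8},
     "Data included": {"500MB": -1.2, "1GB": -0.6, "10GB": 0.3, "Unlimited": 1.5},
     "International minutes included": {"0 min": -0.2, "90 min": 0.0, "300 min": 0.2},
     "SMS included": {"300 messages": -0.3, "Unlimited text": 0.3}},
    {"Price": {"$70": -0.5, "$50": 0.0, "$30": 0.5},
     "Data included": {"500MB": -1.0, "1GB": -0.4, "10GB": 0.2, "Unlimited": 1.2},
     "International minutes included": {"0 min": -0.1, "90 min": 0.0, "300 min": 0.1},
     "SMS included": {"300 messages": -0.2, "Unlimited text": 0.2}},
]

attributes = list(respondents[0].keys())
level_names = [list(respondents[0][attr].keys()) for attr in attributes]

def concept_value(partworths, concept):
    """Total utility of one concept for one respondent: sum of its level partworths."""
    return sum(partworths[attr][level] for attr, level in zip(attributes, concept))

# Average each concept's value across respondents.
values = {
    concept: np.mean([concept_value(r, concept) for r in respondents])
    for concept in itertools.product(*level_names)
}

# Rescale so that 0 represents the average concept, then rank from best to worst.
mean_value = np.mean(list(values.values()))
values = {concept: value - mean_value for concept, value in values.items()}
ranked = sorted(values.items(), key=lambda kv: kv[1], reverse=True)
print(ranked[0])  # the most preferred concept and its average value to customers
```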

The first-ranked combination is the most preferred concept across respondents. In this example, a mobile
plan of $30 per month, with unlimited data included, 300 min of international calls included and
unlimited text included, has the highest overall utility across respondents.


If there are too many possible combinations (over 500), the system will take a sample of 500
combinations and present them in the ranked list. Therefore, the list is not always exhaustive.

Distribution of preferences for brands


For Brand-Specific Conjoint, Conjointly presents the distribution of preferences for different concepts
by brand. You can access the chart via Brand preferences under the Insight tab.

This violin chart compares preferences for brands and helps you identify which brands have more or
less variation across their constituent concepts. Each violin-shaped plot shows the scores of different
combinations of features within each brand/SKU. Median values are shown as the diamonds in the
middle of each violin.
The width of each violin shows how scattered the scores of different concepts are. Where the violin
plot is wider, there are more concepts. In the example above, Kea Rocketta has more concepts on the
lower end, while Maruda Maru II and Ladina Klubnika have a relatively evenly distributed spread.
You can compare the chart with the Ranked list of product concepts as preferred by customers
below to get a better understanding of the spread. Again, in this example, Kea Rocketta has two
concepts on the low end and one on the high end, resulting in a wider plot on the low end.



In interpreting the violin chart, it is important to remember that the relative performance of a brand will
be affected by the features that are applied to that brand, especially if one of the brands was shown
with unusual or unrealistic features or price levels.
The scale of this chart is arbitrary, but it is consistent with the ranked list of product concepts
described above. However, it is not consistent with attribute partworths and level partworths.

FAQs
Do partworth utilities show variability of preferences across
consumers?
No, partworth utilities (both attribute importance scores and level preference scores) show only mean
(average) preferences and importances. To gauge variability of preferences, you may want to look at
the distribution of individual-level HB coefficients or perform simulations.

Why do importance scores always sum up to 100% in Brand-Specific Conjoint?
In Brand-Specific Conjoint, importance scores still sum up to 100% within each brand, even though the
spread between the worst and best concepts differs across brands. This is so that the importance
scores maintain their percentage scale.

Next steps
Learn more about how to calculate partworth utilities.
Review an example report on preferences in ice-cream.
Learn more about preference scores in claims tests.
See more example conjoint reports in your experiments.
