Prepared by
Hill+Knowlton Strategies
June 11, 2019
Vendor Performance
Management (VPM)
What we heard
A summary of stakeholder engagement
© Hill+Knowlton Strategies
Hill+Knowlton Strategies (H+K) was retained by Public Services and Procurement Canada (PSPC) to provide
support in undertaking this engagement project. H+K developed the agenda for the in-person roundtables and
WebEx sessions, and facilitated these discussions in collaboration with PSPC; PSPC selected and invited
participants. A separate online survey was developed and launched by PSPC. H+K’s role was to analyze and
report on data collected through the in-person roundtables and webinars (and not the online survey). PSPC
reviewed draft versions of this report and provided H+K with written feedback, which was incorporated into the
final product.
Table of Contents
Key findings
Executive summary
    Introduction
    Overview
    Building a VPM Policy
    Key Performance Indicators
Introduction + Context
    Context
    Engagement
    Outreach
Detailed findings
    Opportunities
    Challenges
Building a VPM policy
    Application of a VPM Policy
    Vendor Performance Evaluations
    Other Considerations
Key performance indicators
    Quality
    Cost
    Schedule
    Management
Appendix A Consultation Events and Locations
Appendix B Participant Worksheets
Key findings
Overall, stakeholders who participated in the
consultations agreed that a Vendor Performance
Management (VPM) regime could motivate
vendors to perform well and be a productive
management tool for government, but there
were concerns it could disadvantage new
vendors without a history of commerce with the
federal government, be resource intensive and
impose additional costs on vendors and
contracting authorities.
Stakeholders outlined a number of key policy
decision points for a VPM regime, including the
following.
Application
Most participants believed a VPM regime should
apply to all government contracts for all goods
and services groups, but were split on the
appropriate financial threshold. Almost half of
participants said it should apply to contracts of
any amount, while others said it should apply
only to contracts worth $100,000 or more,
including higher thresholds (e.g., $1 million and
above). A minority said it should apply to
contracts below $100,000, including thresholds
of $25,000 or more.
Frequency
Some participants favoured interim results every
six months (or at the mid-way point for contracts
shorter than six months) while others preferred
every 12 months or only at contract close-out. In
all cases, participants said the impact of frequent
evaluations on the workload of the technical
authority (TA) must be taken into account.
Calculation
To calculate a vendor’s rating, some participants
preferred a weighted average of all of the
vendor’s final and interim scores, some
suggested using the vendor’s final scores and
most recent interim scores, and others wanted to
use final scores only. Participants also lacked a
clear consensus about how to score new
vendors during the bid evaluation process, but
slightly favoured the provision of “3” as a default
score for new vendors.
Responsibility
The contracting authority (CA), or a combination
of the CA and the TA, should be responsible for
conducting and communicating the results of
vendor performance evaluations, with the CA
taking a lead role in overseeing and validating
the evaluations.
Appeals
The appeals process should be at arm’s length
from PSPC, taking the form of either an
independent appeals organization on its own or
in combination with a senior management
committee of PSPC.
Participants also proposed a number of Key
Performance Indicators (KPIs), including: the
quality of vendor service post-contract; use of
cost savings or innovative approaches by
vendors; the delivery of the contract under
budget or ahead of schedule; the accuracy of
cost forecasting, or cost containment; the
successful completion of key milestones and/or
delivery dates; and, the frequency and quality of
vendor communications with CAs and TAs.
Executive summary
Introduction
The Government of Canada, which currently
purchases $22 billion worth of goods and
services annually from approximately 37,500
suppliers, wants to reinforce good vendor
performance and position itself as a buyer of
choice. As a result, Public Services and
Procurement Canada (PSPC) is leading the
development of a VPM regime that would
encourage good vendor performance, hold
vendors accountable for poor or unacceptable
behaviour, and create a repository of past
vendor performance.
PSPC has drafted a proposed VPM Policy that
would include four performance indices (Quality,
Cost, Schedule, and Management) under which
KPIs, specific to the group of goods or services,
would be evaluated using a five-point scale. To
gather input from industry and government
stakeholders on the key elements of the
proposed policy, PSPC held regional, in-person
consultations in the six PSPC regions of Pacific
(Vancouver), Western (Edmonton), Ontario
(Toronto), National Capital Region
(Ottawa/Gatineau), Quebec (Montreal) and
Eastern (Halifax) in March 2019. Two WebEx
consultations for stakeholders unable to attend
the regional in-person sessions took place in
early April 2019.
Overall, stakeholders who participated agreed
that a VPM regime could motivate vendors to
perform well and be a productive management
tool for government, but there were concerns it
could disadvantage new vendors without a
history of commerce with the federal
government, be resource intensive and
impose additional costs on vendors and
contracting authorities.
Overview
Several benefits of a VPM regime were
identified, such as encouraging vendors to
provide added value and innovative solutions. It
was noted by government and industry
stakeholders that VPM could help to improve
communication between vendors, contracting
authorities (CAs) and technical authorities (TAs),
help set expectations on both sides, and provide
constructive feedback for vendors. Industry
stakeholders also indicated that VPM could help
ensure issues related to expectations for the
contract and to poor performance are raised
before the end of the contract, when it is already
too late. Several industry stakeholders in the
Western region touted VPM as a step in the right
direction towards the adoption of QBS
(Qualifications-Based Selection) style
procurement that emphasizes the qualifications
of a vendor’s bid over pricing. According to
government stakeholders, VPM would enable
CAs and TAs to access vendor past performance
information to inform future procurement
decisions.
Participants also identified several potential
challenges. For industry stakeholders,
particularly from small and medium enterprises,
potential challenges include the creation of
barriers for new vendors who do not have a
performance rating when applying for
government contracts, and the imposition of
additional administrative burdens and red-tape,
including the potential for additional costs.
Among government stakeholders, it was outlined
that a VPM regime could similarly result in an
additional administrative burden and add to the
workload of CAs and TAs who would be
responsible for completing vendor performance
evaluations and the maintenance of the central
vendor performance database. Stakeholders
from the National Capital Region (NCR), in
particular, emphasized the need for clarity
around established roles and responsibilities for
government stakeholders under a VPM regime.
There were concerns about how a VPM regime
could be applied in certain circumstances, such
as in rural or remote communities that lacked a
large pool of vendors, for sole-source contracts
or where a vendor subsequently was part of a
merger or acquisition. Some government
stakeholders with experience of a VPM regime
noted their hesitancy to record negative
evaluations because of the potential for the
evaluations to be accessed under Access to
Information or similar federal rules.
Stakeholders agreed that communication would
be important to ensure the success of any VPM
policy, with vendors informed early on in the
contracting process about key performance
indicators (KPIs). They agreed that there should
be regular check-ins and communications
timelines set out in the policy to avoid
misunderstandings and potential delays.
The evaluation process should be flexible to
allow vendors to work on their performance and
improve their performance score, with the results
of the evaluation process disseminated across
government departments to amplify learnings
and clarify any misunderstandings.
Building a VPM Policy
Participants in the consultations were asked to
create their own VPM Policy, incorporating key
parts of the VPM Process.
Three-quarters felt the policy should be applied
to all government contracts for all goods and
services groups, but were evenly divided
between applying it only to contracts in excess of
$100,000 (including higher thresholds such as
$1 million and above) or to contracts of any
amount. A minority of participants proposed
applying VPM to contracts under the
$100,000 threshold. Many participants who felt it
should apply to all contracts supported a simpler
VPM process for smaller-value projects. One
quarter believed the regime should be applied
only to government contracts for certain goods or
services groups, contingent on characteristics
such as risk or location.
Participants differed on when vendors should be
provided with interim evaluation results. The
largest number favoured reports every six
months or, for contracts shorter than six months,
at the mid-way point of the contract and at close-
out. Others preferred reports every 12 months
(or only at contract close-out), at other intervals
such as contract milestones, or at the discretion
of the CA. Some said the timing of reports should
be dependent on the nature of the goods or
services provided. Whatever interval was
chosen, it would be important to consider the
impact frequent evaluations would have on the
TA’s workload.
The most often proposed method for calculating
a vendor’s performance rating at contract close-
out would be to use a weighted average of all the
vendor’s final and interim scores. Some
participants preferred using the vendor’s final
scores and most recent interim scores, or just
the vendor’s final scores. The CA should have
primary responsibility for initiating and
communicating the results of evaluations, while
the TA would conduct the evaluations in most
cases. Participants underscored the importance
of appropriate technical knowledge in the
execution of performance evaluations.
Most participants agreed that vendor
performance ratings should be used as a
weighting in future contract evaluations, along
with price and, where applicable, technical
compliance. Many felt that the fairest way for
new vendors and existing bidders to compete
would be to assign any bidder without a valid
performance rating a default score of “3”. A
similar number of participants noted that newer
vendors could instead be assigned the average
of the scores recorded in the database, or that
the points could be reallocated proportionally to
other evaluation criteria (financial and
non-financial).
Discussions about the most appropriate appeals
process revealed considerable differences
among participants, with most evenly split
between an independent appeals organization
and a combination of an independent appeals
organization, an executive of the CA
organization and/or a senior management
committee of PSPC. Whatever process was
selected, participants emphasized it should be at
arm’s length from PSPC, particularly if vendor
performance scores are used in bid solicitation.
According to stakeholders, it needed to be
clarified whether a vendor’s score would apply
during the bid evaluation process while that
score is under appeal.
Key Performance Indicators
Participants were asked to brainstorm KPIs and
corresponding metrics. The results corresponded
with the four performance indices (Quality, Cost,
Schedule, Management) proposed for the VPM
regime. Generally, the participants indicated that
KPIs should be specific to the group of goods or
services.
The most discussed KPIs related to quality
include vendor post-contract servicing; quality of
the final product or deliverable; change orders or
contract deviations; and adherence to items in
the Statement of Work (SOW) or Request for
Proposal (RFP). It was acknowledged that
developing objective quality indicators may be
easier for goods and certain services, such as
construction, than for other services.
Participants often outlined cost savings or
innovation related KPIs, including delivering the
contract under budget or ahead of schedule, the
accuracy of cost forecasting, or cost
containment. Some participants were wary of
including any cost-related KPIs, noting that cost
would already have been evaluated during the
bid solicitation process, or may not be relevant
for fixed cost contracts.
The most frequently identified schedule KPIs
measure whether key milestones and delivery
dates have been met. Other schedule KPIs could
be the use of progress reporting and
communication with CAs and TAs about potential
delays or the need for extensions (this is also
related to management). Some participants
believed there should be separate schedule KPIs
for goods and services, with those for goods
based on contract terms and those for services
based on percent milestones or deliverables met.
The frequency and quality of vendor
communications with CAs and TAs was often
cited as a management KPI, along with the
resolution of claims or work after contract close-
out and elements of a vendor’s financial
management, such as invoice accuracy. Some
participants expressed concerns about the
subjectivity of management KPIs, suggesting
instead that they be included in the “Quality”
index.
Introduction + Context
Context
The Government of Canada is aiming to
transform the way it conducts business in order
to reinforce good performance and position itself
as a buyer of choice. To achieve this, Public
Services and Procurement Canada (PSPC) is
leading the development of a Vendor
Performance Management (VPM) regime. This
transformational procurement initiative will
provide a framework for evaluating the
performance of vendors in federal contracting
and will eventually be used to inform future
contract award decisions.
While the VPM Policy will at first only apply to
PSPC administered contracts, the goal is to
eventually have it apply to all applicable
contracts issued by Government of Canada
departments and agencies. The VPM regime’s
key goals are to:
Optimize best value through consideration
of performance in the award of federal
government contracts;
Facilitate open, ongoing communications
and relationship building between
government and vendors;
Improve vendor performance by holding
vendors who do not perform to account
while encouraging good performance;
Promote public confidence, the
accountability of public funds, and
responsible partnerships.
The development of VPM comes in response to
the Minister’s mandate of improving government
procurement practices, and various reports by
the Office of the Procurement Ombudsman
(OPO) on concerns around the government’s
ability to manage the performance of its vendors.
There is currently limited tracking and direction
on the management of vendor performance
across government departments.
The Government of Canada procures over $22
billion in goods and services annually through
approximately 37,500 suppliers. By tracking the
performance of suppliers, the Government of
Canada will be able to build on the successes of
other jurisdictions already employing VPM and
measure the performance of its vendors.
The proposed VPM model for the Government of
Canada features a number of key components,
including:
Standardized approach to performance
evaluations;
Use of vendor performance ratings in bid
selection;
Centralized repository of vendor
performance ratings;
Encouragement of vendors to provide
added value and to innovate when possible;
Improved identification of underperforming
vendors;
Ongoing communications, including about
performance expectations;
Principles of openness, fairness, and
transparency supported throughout.
As part of the proposed VPM regime, a rating
system consisting of a five-point scale and four
performance indices has been developed, as
outlined below. The five-point scale would rate
each Key Performance Indicator using a scale
from “1” to “5”, where “3” is the expected level of
performance.
Table 1 Proposed performance indices and scale

Performance indices
Quality: The vendor’s effectiveness in supplying deliverables of the required quality, in conformance with the contract.
Cost: The vendor’s effectiveness in forecasting, controlling and managing contract cost, in conformance with the contract.
Schedule: The vendor’s effectiveness in maintaining the schedule for the completion of the contract, task orders, milestones, delivery, and administrative requirements, in conformance with the contract.
Management: The vendor’s effectiveness in integrating and coordinating all activities needed to execute the contract, including client-focused behaviour, collaboration, cooperation and issue resolution, in conformance with the contract.

Performance scale
Exceptional (5): The vendor’s performance greatly exceeds the expected performance.
Surpassed (4): The vendor’s performance exceeds the expected performance.
Achieved (3): The vendor’s performance meets the expected performance.
Moderate improvement required (2): The vendor’s performance is below the expected performance.
Significant improvement required (1): The vendor’s performance is significantly below the expected performance.
KPIs will be developed for each group of goods
and services. These KPIs will be used to create
a scorecard for each vendor’s performance. The
scorecard will also provide quantitative or
qualitative parameters for each level on the
1-to-5 scale for each KPI. Vendors will be
evaluated at the end of the contract and at a
frequency yet to be determined (typically every
six months) throughout the lifecycle of the
contract.
Engagement
To elicit input from industry and government
stakeholders on key elements of the proposed
VPM policy, regional consultation sessions were
held between March 4, 2019 and March 28,
2019 in the six PSPC regions: Pacific
(Vancouver), Western (Edmonton), Ontario
(Toronto), National Capital Region
(Ottawa/Gatineau), Quebec (Montreal) and
Eastern (Halifax). Thirteen in-person sessions
were held in total: two consultation sessions
were held in every region, except Quebec, with
industry and government sessions held
separately; three sessions were held in Quebec,
with two industry sessions (English and French)
and one government session. Participating
stakeholders included representatives from
municipal, provincial and federal governments
(mostly federal government), along with
representatives from industry associations and
government vendors. Each session focused on
either government or industry participants in
order to facilitate open and honest participation
from both sets of stakeholders. Two additional
WebEx consultations took place at the end of the
consultation period to gather input from
stakeholders unable to attend the regional in-
person sessions. A list of consultation events
and locations is provided in Appendix A.
Following a context-setting presentation
delivered by a PSPC representative, participants
with direct VPM experience were asked to
outline their experiences with a VPM approach,
while those without such experience were
encouraged to share possible opportunities and
challenges of a VPM policy. Participants were
then asked to discuss and identify key decision
points of the VPM policy and brainstorm potential
KPIs for use in a new policy. Participants were
directed to complete the activities at their tables
with a table-based worksheet (Appendix B).
This report summarizes the input received from
participants and notes recorded by an observer.
In addition to the in-person and WebEx
consultation events, feedback from two formal
submissions made to PSPC in response to the
Buyandsell.gc.ca Request for Information (RFI)
posting of the draft VPM Policy also forms part
of this report.
Outreach
The Strategic Policy Sector (SPS) of PSPC
invited, via Eventbrite, approximately 3,800
participants proposed by the regional leadership
of the Office of Small and Medium Enterprises
(OSME). In some regions, potential
participants were identified using the Supplier
Registration Index, as well as through existing
groups (e.g., Client Advisory Board, Regional
Executive Committee, Supplier Advisory
Committee, and the Supplier Advisory
Committee’s Subcommittee on VPM).
To further promote the WebEx and regional
consultation sessions, and distribute related
documentation, SPS posted an RFI on
Buyandsell.gc.ca. The regional and WebEx
consultation sessions were also advertised on
the Buyandsell.gc.ca events calendar page. In
addition, all invitees to the regional consultation
sessions were informed about the WebEx
sessions in case they weren’t able to attend the
in-person consultations.
OSME regional offices further supported various
outreach efforts, including in some cases:
Circulation of posters by OSME and SPS
to various suppliers and industry
associations (e.g., the Environmental
Services Association of the Maritimes,
the Atlantic Canada Aerospace and
Defence Association, the Consulting
Engineers of Nova Scotia);
Posting notice of the consultation
sessions on regional OSME pages; and,
Circulation to various networks, including
government procurement, senior
executive, management, federal-
provincial-territorial, and Municipal,
Academic, Schools and Hospitals
(MASH) networks.
Stakeholders invited to the consultations
included: procurement professionals and project
managers from PSPC and various other federal
government departments and agencies;
municipal and provincial government contacts;
and, vendors, including diverse vendor groups
(e.g., women-owned, Indigenous-owned, and
small and medium companies), and industry
associations. In total, 191 stakeholders
participated in the consultation sessions.
Detailed findings
Opportunities
Government and industry stakeholders
highlighted how a VPM policy could motivate
vendors to perform well. If vendors understand
the VPM approach, they may be encouraged to
provide added value and innovative solutions.
According to industry stakeholders, under a VPM
regime, vendors could be rewarded for good
performance with non-financial incentives.
Conversely, government stakeholders indicated
a VPM regime could provide clear expectations
to vendors about the results of poor or subpar
performance, including the use of debarment or
penalties.
Industry stakeholders suggested that a VPM
regime could lead to the consideration of a
vendor qualifications model, such as the QBS
method, that prioritizes vendor qualifications over
price. Stakeholders suggested this could be
facilitated by the utilization of previous vendor
performance information in the bid solicitation
phase of the procurement process and could
help prevent a “race to the bottom” in terms of
vendor pricing and quality.
Stakeholders from industry and government
touted VPM as a potential mechanism for
improving communication and setting
expectations in the procurement process.
Government stakeholders remarked that a VPM
policy may prevent “things [from] going
sideways” in the contracting process and could
serve to improve vendor performance through
ongoing positive and constructive feedback,
such as the evaluations and check-ins mandated
by a VPM policy. Many participants, especially
industry representatives, stated VPM should also
include 360-degree feedback, as they felt such
evaluations could serve as a means of
disseminating “lessons learned” and fostering an
environment of mutual trust between government
authorities and vendors. However, industry
stakeholders cautioned that, in order for a VPM
policy to facilitate better contract management,
government authorities need to include “face-to-
face” communication and site visits where
possible: “Tough to build a relationship and have
two-way communication when there is a virtual
relationship or general mailbox.”
Stakeholders generally agreed that outlining the
requirements and performance metrics (i.e.
KPIs) will produce clarity and direction for
vendors. This will allow vendors to plan their
business needs around the criteria for contracts
and understand the implications of not meeting
established KPIs. Stakeholders noted it is
important to outline these expectations and KPIs
early, i.e. at the RFP stage, in order to facilitate
good performance, prevent disputes, and allow
vendors the opportunity to improve areas of
concern.
Government participants in Edmonton and
Halifax who had previous experience with a VPM
regime in their jurisdiction spoke to the benefits
of a VPM regime in producing constructive
feedback for vendors and as a productive
management tool for government. In particular,
they noted the suitability of a VPM regime for the
construction and professional services industries.
Stakeholders noted that the development of a
centralized repository of vendor performance
data would make it easier for government
contracting authorities to access vendor
information and inform future procurement
decisions, such as through a pre-qualification
process for contracting decisions.
Other opportunities that a VPM policy may
produce, as outlined by participants, include:
Training, particularly for government
employees involved in procurement, to
effectively understand their roles and
responsibilities;
Ensuring a defensible approach, through
the use of third-party bodies and recourse
mechanisms;
A necessary constraint on changing
expectations after contract award; and,
The recognition and incentivization of
vendor innovation.
Challenges
Participants cautioned that new vendors may be
disadvantaged by a VPM regime, including
barriers associated with not having a history of
past performance established with the
Government of Canada. For instance, it was felt
by some that assigning a score of 3 to new
entrants during the bid solicitation process may
disadvantage these vendors if other vendors in
their goods or services group routinely achieve
higher scores. A VPM policy may present other
barriers for newer vendors, including an
increased administrative burden or additional
financial pressures.
Moreover, government and industry stakeholders
discussed how the implementation of a VPM
regime could prove to be resource intensive for
both established vendors and government
authorities. For instance, vendors may incur
additional costs associated with performance
reporting or ensuring they are in line with newly
created KPIs. For government authorities, there
could be additional responsibilities placed on
Contracting Authorities (CA) or Technical
Authorities (TA) to track and complete vendor
performance evaluations. Additionally,
stakeholders noted government authorities need
to consider the resources required to maintain a
central vendor performance repository.
Participants spoke at length about the important
role communication would play in ensuring the
success of a VPM policy.
Government stakeholders acknowledged that
vendors must be informed about KPIs early in
the contracting process to ensure they are aware
of the evaluation criteria and about any
performance issues. Industry stakeholders noted
that this includes adhering to regular check-ins
and communication timelines set out in the VPM
policy, to avoid misunderstandings and potential
delays. However, there were concerns raised
about the current state of communications, and
the additional impact that a VPM policy would
have on the relationship between vendors and
procurement officials.
Stakeholders outlined several challenges facing
the evaluation of vendors under a VPM regime.
Private sector representatives from the
professional services industry noted the difficulty
in adhering to potentially “subjective”
measurement standards and a potential issue
around the consistency of application of
evaluation standards. Industry stakeholders were
also wary that evaluation criteria could be used
to penalize vendors for aspects outside of their
control, such as the addition of change orders by
government authorities, and reinforced the
importance of being open and upfront about
evaluation criteria so that businesses can plan
and adjust. To that end, vendors recommended
that subject matter experts should be involved in
the completion of vendor performance scores.
This would account for the unique differences
between goods and services groups, and
between industries, and provide a level of
consistency to the evaluation process.
A similar sentiment was provided by government
representatives who were wary about the
qualifications of CAs and TAs to evaluate vendor
performance. Representatives who had previous
experience with a VPM regime noted their
hesitancy to record negative evaluations, out of
concern third parties could obtain them through
an access to information request. Other
government representatives voiced
apprehension around penalizing vendors who
are performing poorly. They described
government’s current lack of enforcement
capabilities and outlined concerns that penalties
or poor ratings for poor performers could be
difficult to defend. Similarly, industry
stakeholders were concerned about receiving a
negative evaluation because of factors outside of
their control (e.g., change of scope or
requirements by the CA).
Government representatives were, however,
aware that the evaluation process (including the
calculation of vendor performance ratings) needs
to be flexible and allow for vendors to work on
their performance and thus improve their
performance score: “Possibility for reset -
improvements post fail.”
Representatives with direct VPM experience also
spoke about the importance of disseminating the
results of evaluations across government
departments in order to amplify learnings and
clarify misunderstandings. This was reinforced
by a formal submission, which highlighted the
impact of VPM on government procurement
culture, including the potential for resistance to
VPM adoption.
According to some government stakeholders,
there may be situations where a VPM regime
should not be applicable to the contracting
process. For instance, in a rural or remote area,
there may not be a sufficient number of vendors
available, therefore it would be difficult to apply
evaluation standards.
Other issues identified by stakeholders that may
impact evaluation include: staff turnover; delays
on security checks for vendors; fear of reprisal or
dispute from a vendor in the case of a poor
performance score; impeding innovation if
vendors are forced to follow specific evaluation
criteria; concerns around the establishment of
groups of goods and services; and, the possible
lack of independence of the appeals process.
Building a VPM policy
During the second activity of the sessions,
participants were asked to create their own VPM
Policy, using a table worksheet that featured key
parts of the VPM process. These included: the
type of government contracts that could be
subject to VPM; the frequency and use of interim
and final vendor performance evaluation scores;
responsibility for conducting vendor performance
evaluations; the process for new bidders without
a valid performance rating; and, the appeals
process. Participants were also asked to explore
how government could strengthen its relationship
with vendors and incorporate their feedback.
Note: Graphs in this section reflect the
responses of consultation participants (see
Appendix A).
Application of a VPM Policy
Most participants indicated that VPM should be
applied to all government contracts for all groups
of goods or services. However, they were divided
when discussing the monetary threshold. Many
participants indicated that VPM should only apply
to contracts over $100,000, while a similar
proportion of participants preferred the
application of VPM to contracts of any amount.
However, participants indicated that under this approach, not all contracts should receive the same level of oversight; for example, smaller projects could employ a simpler application of the VPM regime. Applying a VPM policy to all government contracts, especially at the early stages of a VPM policy, could also help develop the organizational profile of a diverse range of vendors and enable government contracting authorities to learn from the experience.
Table 2 Application of a VPM policy (value)
Between $0 - $25 K: 3.4%
Between $25 K - $100 K: 5.2%
Between $100 K - $1 M: 31.0%
Over $1 M: 10.3%
At any amount: 22.4%
Other: 27.6%
Other includes: reflective of the industry; depends on the good or service; depends on risk of the contract; depends on sensitivity of the contract; tie to trade policies; in accordance with tax policies.
A smaller number of participants indicated that VPM could be applied selectively to certain groups of goods or services, contingent on characteristics such as risk (e.g., a building restoration project that is relatively low cost but high-profile), good or service group (e.g., all goods and services except specific growth industries) or location.
According to participants, the application of VPM
could also depend on the level of effort required.
For instance, a complex project with a high value
should be subject to more performance oversight
compared to a lower valued project with less
complexity. This notion was supported by a formal submission, which called for the increased
management of strategic or transformational
goods (e.g., contracts essential for core
government services) over “those that offer
transactional or commoditized goods or
services.” Another formal submission received
similarly proposed that government utilize a
segmentation analysis to identify vendors that
require a higher level of interaction under a VPM
regime.
Table 3 Application of VPM policy (good/service type)
All groups of goods or services: 76%
Only certain groups of goods or services: 24%
Vendor Performance Evaluations
Participants were asked to outline at what
interval they felt vendors should be provided with
feedback. Most participants indicated that
Contracting Authorities should provide interim
evaluation results to vendors every six months.
In the case of contracts shorter than six months,
it was indicated that vendors could be provided
with results mid-way and at project close-out.
Table 4 Frequency of Vendor Performance Evaluations
Every 6 months: 38%
Every 12 months: 15%
Only at contract close-out: 10%
Other: 37%
Other includes: contract milestones; at the discretion of the CA; at 25% and 50% of contract completion; depends on the good or service group.
Stakeholders also indicated that interim
evaluations could be held quarterly, and that the
frequency of evaluations could depend on the
length of contract, value of the contract or at the
discretion of the CA. Stakeholders noted that
results could be provided at key contract
milestones, and indicated that it is important to
consider the impact of frequent evaluations on the workload of the technical authority (TA) and CA.
Participants suggested that the vendor’s overall
rating (which will be used for bid evaluations)
should be calculated using a weighted average
of all of that vendor’s final and interim contract
scores. Participants who preferred this approach
highlighted that interim contract scores could
incentivize vendors to perform well throughout
the contract, and not just at the end of the project
lifecycle. However, some participants indicated
that the overall ratings should be based only on
final contract scores, noting that this would
provide vendors the “opportunity to improve” on
issues that may arise during the lifecycle of the
contract. Other participants advocated for increased flexibility in the process, noting the choice should be “all up to the [TA].”
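As a purely illustrative sketch of the approach most participants preferred, an overall vendor rating could be computed as a weighted average over a vendor's final and interim contract scores. The 1-5 scale, the 70/30 weighting, and the data structure below are assumptions for illustration only, not part of any proposed policy.

```python
# Illustrative only: the scale (1-5) and the final/interim weights are
# assumptions, not policy. Each contract contributes its final score plus
# the average of its interim scores, weighted.

def overall_rating(contracts, final_weight=0.7, interim_weight=0.3):
    """Average the weighted per-contract scores across a vendor's history."""
    per_contract = []
    for c in contracts:
        # If no interim scores were recorded, fall back to the final score.
        interim_avg = (sum(c["interim"]) / len(c["interim"])
                       if c["interim"] else c["final"])
        per_contract.append(final_weight * c["final"] + interim_weight * interim_avg)
    return round(sum(per_contract) / len(per_contract), 2)

history = [
    {"final": 4.0, "interim": [3.0, 3.5]},  # performance improved over the contract
    {"final": 2.5, "interim": [2.0]},
]
print(overall_rating(history))  # 3.06
```

Including interim scores in the average, as the sketch does, reflects the incentive participants highlighted: vendors are rewarded for performing well throughout the contract, not just at close-out.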
Participants raised questions around the use of
performance evaluations for sole source
contracts and in cases of mergers or
acquisitions.
Table 5 Calculation of overall performance ratings
Final and interim scores: 41.7%
Final scores and most recent interim scores: 29.2%
Final scores: 29.2%
13| © Hill+Knowlton Strategies
The vast majority of participants indicated that
vendor ratings should be used as a weighting in
future contract evaluations, along with price
and/or technical compliance. Among those
participants, it was indicated that using the
scores as a mandatory criterion or screening tool
may be too rigid and could screen out potentially
suitable vendors. Other comments from
participants featured a mixture of all uses,
including as a mandatory screening criterion, as
a weighting in evaluations and, in the case of
exceptionally poor performance, in the
debarment of vendors. In a qualifications-based selection (QBS) procurement system, the scores could be used in place of pricing during the procurement screening process.
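To illustrate what "a weighting in evaluations, along with price and/or technical compliance" could look like, the hypothetical sketch below blends a normalized technical score, price score, and performance rating. The 50/30/20 split, the 1-5 rating scale, and the price normalization are assumptions for illustration, not anything proposed in the consultations.

```python
# Hypothetical bid-evaluation formula: all weights and scales are assumptions.
# Each component is normalized to 0-100 before weighting.

def bid_score(technical_pct, price, lowest_price, rating, rating_scale=5,
              w_technical=0.5, w_price=0.3, w_rating=0.2):
    price_pct = 100 * lowest_price / price       # cheapest compliant bid scores 100
    rating_pct = 100 * rating / rating_scale     # performance rating, normalized
    return round(w_technical * technical_pct
                 + w_price * price_pct
                 + w_rating * rating_pct, 1)

# A slightly pricier bid with a strong track record can outrank the cheapest bid:
a = bid_score(technical_pct=80, price=110_000, lowest_price=100_000, rating=4.5)
b = bid_score(technical_pct=80, price=100_000, lowest_price=100_000, rating=2.0)
print(a, b)  # 85.3 78.0
```

Because the rating enters as one weighted factor rather than a pass/fail screen, a weak score lowers a bid's ranking without automatically screening the vendor out, which is the flexibility participants said they wanted.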
Table 6 Use of overall performance ratings in contract award decisions
As a mandatory criterion/screening: 6%
As a weighting in evaluations, along with price and/or technical compliance: 76%
For debarments and/or suspensions: 8%
Other benefit/consequence: 10%
Other includes: on a QBS procurement system; only as a mandatory criterion for risky projects; a combination of mandatory criterion and weighting.
According to participants, the primary
responsibility for overseeing and communicating
the results of vendor performance evaluations
should lie with the CA. Since the TA has the
closer working relationship with the vendor, it
was noted that they should conduct the
evaluation in most cases, with the CA holding the
final say and the responsibility of communicating
the final results to the vendor. However, it was
indicated that both the CA and TA require
technical knowledge in order to complete an
objective evaluation, and that the clarification of
roles and responsibilities is important in order to
avoid misunderstandings during the evaluation
process. To increase objectivity, one participant
called for increased clarity around the definition
of the performance scale.
Table 7 Responsibility for conducting and communicating the results of vendor performance evaluations
The Contracting Authority: 31.4%
The Technical Authority: 7.8%
Both: 60.8%
If a bidder does not have a valid performance
rating on file under this policy (e.g., a new
vendor), then participants indicated their bid
could be scored by using a default score of “3.”
Participants noted that providing a neutral score
was the fairest way for newer vendors and
existing bidders to compete for government
contracts. However, other participants instead
preferred a default score that is the average or
median of the ratings recorded in the database
for that industry or for that group of goods or
services. Some participants suggested a
different approach where performance is not
considered at all for new vendors (instead
relying exclusively on price and technical
aspects), or utilizing VPM for “bonus points” in
contract award decisions. However, concerns
were also raised about this approach, including
the potential that only the lowest-priced bidders
would be considered or that it could provide
newer entrants with an advantage over
established vendors during bid evaluation.
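The fallback options participants discussed for bidders without a rating on file can be sketched as a simple lookup with configurable defaults. The group names, the 1-5 scale, and the function shape below are hypothetical, shown only to contrast the fixed neutral default with the group-average and group-median alternatives.

```python
# Illustrative sketch only: group names, the 1-5 scale, and the policy
# options are assumptions used to contrast the fallbacks participants raised.
from statistics import mean, median

def effective_rating(vendor, ratings_by_group, group, policy="neutral"):
    """Return the vendor's rating, or a fallback if none is on file."""
    group_ratings = ratings_by_group.get(group, {})
    if vendor in group_ratings:
        return group_ratings[vendor]
    pool = list(group_ratings.values())
    if policy == "group_average" and pool:
        return round(mean(pool), 2)
    if policy == "group_median" and pool:
        return median(pool)
    return 3.0  # fixed neutral default, the "3" most participants preferred

db = {"construction": {"VendorA": 4.2, "VendorB": 2.8, "VendorC": 3.4}}
print(effective_rating("NewCo", db, "construction"))                   # 3.0
print(effective_rating("NewCo", db, "construction", "group_average"))  # 3.47
```

The choice of fallback embodies the trade-off participants described: a fixed neutral score treats every new entrant identically, while a group average or median anchors new entrants to the prevailing performance level in that group of goods or services.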
Table 8 Scoring of new vendors during bid evaluation
When discussing the responsibility for the
performance evaluation appeals process, many
participants indicated that vendors should be
able to appeal to an independent appeals
organization or to a combination of the options
(i.e. multi-level). Many participants, who
indicated that the appeals process could be
multi-level, described how the CA could handle
the first level of appeal, with the vendor having
the opportunity to appeal further to an
independent organization or a senior
management committee. Participants
emphasized the need for independence in the
appeals process (i.e. at arm’s length from
PSPC), especially if vendor performance scores
are being used in bid evaluations, including a
recommendation for an increased role for the
Office of the Procurement Ombudsman. Other
comments included the need for clarification on
the ability of vendors to bid on contracts during
the appeals process.
Table 9 Responsibility for the appeals process
An executive of the contracting authority organization: 19.6%
A senior management committee of PSPC: 15.7%
An independent appeals organization: 29.4%
A combination of the above (i.e., multi-level): 29.4%
Other: 5.9%
Other Considerations
To strengthen relationships with vendors, most
participants highlighted the need for effective
and ongoing communication between
government contracting authorities and vendors,
particularly following the implementation of a
VPM regime. Participants outlined a number of mechanisms at government's disposal, including the local Office of Small and Medium Enterprises (OSME) team, the design of RFP specifications, and kick-off meetings. Participants also supported providing vendors with constructively framed feedback, taking into account vendor issues such as lengthy delays in security screenings, and following through on the evaluation process.
Similarly, many participants indicated that engaging vendors is important in order to understand their feedback and apply lessons learned. According to participants, vendor
engagement could take the form of a formal
event or check-in (e.g., similar to this
consultation process), 360-degree evaluation for
vendors to provide feedback on government
contracting, or regular and informal
communication.
Key performance indicators
Each table was tasked with brainstorming KPIs and corresponding metrics. The
KPIs developed by participants correspond to
the four performance indices proposed for the
VPM policy: quality, cost, schedule, and
management. Proposed KPIs are summarized in
Table 10.
Quality
The KPIs most discussed by participants
pertained to the quality of vendor service post-
contract, e.g., how often a warranty is used after
the delivery of a good such as furniture, the
durability of a project, or maintenance and
operational costs. Participants also frequently
outlined KPIs pertaining to the quality of the
deliverable, including whether the vendor met
the specifications of the RFP or contract,
compliance with technical requirements, or
whether the vendor provided added value or
innovative solutions. To that end, it was
suggested that the development of KPIs around
quality should include input from the end users of
the product or service delivered, such as through
a client satisfaction survey.
Other KPIs outlined by participants included the
number of change orders or deviations from the
original RFP, including those initiated by the
government. Participants further emphasized the
importance of communication and dispute
resolution by proposing KPIs to assess the
number of complaints against the vendor, or
quality of communication with the CA or TA.
Quality could also be measured using KPIs that
assess auxiliary impacts of the contract,
including community benefits, the environmental
footprint, and the effective resolution of disputes
with the client team.
Participants acknowledged that it may be easier
to develop objective quality indicators for goods
and managed services, such as construction, than for professional services.
Stakeholders were also interested in measuring
the quality of substitute or fill-in goods or
personnel, and whether they were the same
quality as the original.
Cost
For cost-related measures, participants most often outlined cost savings or innovation-related KPIs. These include the measurement of
innovative approaches that deliver cost savings
(e.g., reuse of materials), or the delivery of the
contract under budget or ahead of schedule
(e.g., the provision of incentives for delivering
under budget, “the return on the taxpayer’s
dollar”). According to participants, the
consideration of innovative or value-added ideas
is particularly important for technology contracts,
such as telecom or IT hardware or software,
where there is often drastic innovation in shorter
time frames.
Other proposed cost-related KPIs measure the
accuracy of cost forecasting by the vendor,
including the consideration of lifecycle costs or
cost containment. Industry representatives
insisted that change orders, including those that
affected scope, should be measured, but were
unsure how to accurately measure these
considerations to avoid unnecessarily penalizing
the vendor if the change order was initiated by
the government. Another participant proposed
measuring the financial viability of suppliers,
particularly for suppliers that “present material
business continuity risk” and provide services
essential for core government services.
Other participants were hesitant to include cost-related KPIs at all. They indicated that cost would already have been evaluated during
the bid solicitation process and questioned the
utility of including it in the performance
evaluation.
Schedule
Under the schedule index, participants most
frequently identified KPIs that measured whether
key milestones and delivery dates were met.
Other KPIs noted by participants included
progress reporting and communication with CAs
about potential delays and the need for extensions: “How do you communicate around deadlines when they can’t be met?” This could include the
establishment of a “corrective plan” to meet the
expectations of CAs and TAs and to ensure minimal disruption to operations. Conversely,
other participants discussed the need for a KPI
to measure and encourage vendor flexibility and
adaptability when it comes to dealing with
government delays.
Other participants supported the notion of having
separate schedule KPIs for goods and for
services. Schedule KPIs for goods could be
based on contract terms, including the delivery or
meeting of milestones, while schedule KPIs for services are more variable and could be based on the percentage of milestones or deliverables met. Another
participant outlined that a VPM policy needs to
consider how schedule or cost KPIs are
measured to avoid penalizing the vendor when
delays or cost overruns are unavoidable due to
scope changes initiated by the government.
Management
Management KPIs discussed include the
frequency and quality of vendor communications
with CAs and TAs. Participants also outlined
KPIs that correspond to the resolution of claims
after contract close-out. Many participants who
work with the construction or services industries
emphasized the measurement of a vendor’s financial competency, including the accuracy of
invoices and the timely payment of
subcontractors. Other KPIs included the
deployment of value added or innovative
solutions (e.g. “Vendor proactively and regularly
provides value added/engineering
opportunities”), the management of contract
transition, corporate social responsibility, and
workplace safety.
A formal submission outlined the importance of
risk management KPIs, such as those that
measure business continuity (e.g. the
development of continuity plans for services and
suppliers essential to the end-to-end provision of
government services), or information security
(e.g. the security of vendors handling sensitive
information or accessing government systems).
Some participants outlined concerns around the
subjectivity of management KPIs and instead
proposed they be folded under quality.
Table 10 KPIs identified by participants, in order from most to least popular
Quality KPIs
1. Post-contract quality (e.g. warranty, after claims)
2. Quality of final product or deliverable
3. Change orders or contract deviations
4. Adherence to items in the Statement of Work (SOW)
5. Innovation or community and environmental benefits
6. Communication with government authorities
7. Client satisfaction
8. Resolution of contract issues
9. Unresolved defects or number of deficiencies (i.e. how were they
addressed, the timeline to address or meet RFP specifications)
10. Links to certifications or standards (e.g. the International Organization
for Standardization or ISO)
Cost KPIs
1. Cost savings (e.g. value-added or innovative solutions)
2. Cost overruns
3. Cost forecasting accuracy
4. Change orders
5. Operational costs (e.g. energy or maintenance costs)
6. Invoice accuracy
Schedule KPIs
1. Delivery dates met
2. Progress reporting
3. Milestones met
4. Accuracy of forecast scheduling
5. Ahead of schedule
Management KPIs
1. Open and timely communications with CA and TA
2. Professional interactions
3. Dispute resolution (including claims after contract close-out)
4. Change management (e.g. response to personnel changes and
modifications to scope)
5. Responsiveness and creative solutions
6. Financial quality (e.g. invoice accuracy and timeliness, payment of subcontractors) (1)
7. Treatment of subcontractors
8. Value added or innovative solutions
9. Corporate social responsibility and safety
10. Socio-economic benefits
(1) These KPIs may also fall under the Cost index but were highlighted by participants during the discussion on Management KPIs.
Appendix A Consultation Events and Locations

Date            Location       Stakeholder group                  Participants  Invitees
March 4, 2019   Vancouver, BC  Government                         9             123
                               Government                         7
March 6, 2019   Edmonton, AB   Industry                           9             142
                               Government                         17
March 19, 2019  Montreal, QC   Industry (English)                 11            1,122
                               Industry (French)                  3
March 20, 2019                 Government                         5
March 21, 2019  NCR            Industry                           25            1,265
March 25, 2019                 Government                         17
March 26, 2019  Toronto, ON    Industry                           12            717
                               Government                         10
March 28, 2019  Halifax, NS    Industry                           8             414
                               Government                         23
April 3, 2019   WebEx          Industry and Government (French)   6             N/A (2)
April 4, 2019   WebEx          Industry and Government (English)  29            N/A (2)
Total                                                             191           3,783

(2) All of the invitees from the other sessions were also invited to the WebEx sessions.
Appendix B Participant Worksheets
Figure 1 Worksheet 1 Your Perspectives
Figure 2 Worksheet 2 - Build Your Own Policy (Page 1)
Figure 3 Worksheet 2 - Build Your Own Policy (Page 2)
Figure 4 Worksheet 3 - A Closer Look at KPIs
Hill+Knowlton Strategies
www.hkstrategies.com