Framework lifecycle for developers¶
0 - overview¶
Each framework goes through a sequence of statuses that define the framework lifecycle:
Status | Description
coming | displays a message to users that the framework will be open for applications soon
open | suppliers can apply to the framework
pending | applications close, the reports are generated and sent to the category team
standstill | results sent to suppliers; successful suppliers sign and return their agreement files
live | supplier services are available to buyers on the Digital Marketplace
expired | services for the framework are no longer available on the Digital Marketplace
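For quick reference, the statuses above form a strictly ordered sequence. A minimal Python representation (illustrative only; this constant is not something the codebase defines):

# Illustrative only: the lifecycle statuses in order, stored as plain strings on the framework object
FRAMEWORK_LIFECYCLE = ("coming", "open", "pending", "standstill", "live", "expired")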
There are a number of steps involved in launching a new framework iteration, aside from creating the framework object itself in the database. Follow all the instructions in the Preparing to add a new framework documentation before beginning the process below.
Note
Make sure you coordinate these steps with the framework’s product manager and/or comms team before running on production. There is no ‘delete’ API method for frameworks, so proceed with caution!
1 - coming¶
The framework will be open for applications soon, and a message will be shown on the Digital Marketplace home page.
Once the setup steps in adding frameworks are complete, the new framework object can be created with status “coming”:
curl -H "Authorization: Bearer <API_TOKEN>" -H "Content-Type: application/json" -X POST -d
'{
"updated_by":"developers.email@digital.cabinet-office.gov.uk",
"frameworks": {
"slug": "g-things-23",
"name": "G-Things 23",
"framework": "g-things",
"status": "coming",
"clarificationQuestionsOpen": false,
"lots": ["cloud-hosting", "cloud-software", "cloud-support"],
"hasDirectAward": true,
"hasFurtherCompetition": false
}
}' https://<API_ENDPOINT>/frameworks
Another POST request is required to update the framework with further attributes:

- the known/expected framework lifecycle dates
- the allowDeclarationReuse flag

The allowDeclarationReuse and applicationsCloseAtUTC keys are used to determine if declarations can be reused from one framework to the next. If allowDeclarationReuse is not true, or if applicationsCloseAtUTC is undefined, this framework will not be offered to suppliers as a valid source of answers for their declaration.

The other datetimes are used for display purposes only and do not currently affect framework state in any way (i.e. the status is not automatically changed from live to expired at the time specified in frameworkExpiresAtUTC).
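As a rough illustration of that reuse rule (a sketch only, not the marketplace's actual implementation, which lives in the apps):

# Illustrative sketch only - `framework` is assumed to be a framework record as returned by the API (a plain dict)
def declaration_reusable_from(framework):
    return bool(framework.get("allowDeclarationReuse")) and framework.get("applicationsCloseAtUTC") is not None

# A framework with allowDeclarationReuse set but no applicationsCloseAtUTC is not offered to suppliers
print(declaration_reusable_from({"allowDeclarationReuse": True}))  # False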
Set the lifecycle dates and declaration reuse flag with:
curl -H "Authorization: Bearer <API_TOKEN>" -H "Content-Type: application/json" -X POST -d
'{
"updated_by": "developers.email@digital.cabinet-office.gov.uk",
"frameworks": {
"allowDeclarationReuse": true,
"applicationsCloseAtUTC": "2000-01-01T12:00:00.000000Z",
"intentionToAwardAtUTC": "2000-01-01T12:00:00.000000Z",
"clarificationsCloseAtUTC": "2000-01-01T12:00:00.000000Z",
"clarificationsPublishAtUTC": "2000-01-01T12:00:00.000000Z",
"frameworkLiveAtUTC": "2000-01-01T12:00:00.000000Z",
"frameworkExpiresAtUTC": "2000-01-01T12:00:00.000000Z"
}
}' https://<API_ENDPOINT>/frameworks/<FRAMEWORK_SLUG>
Be careful with timezones: all times in the database and on the servers are in UTC, which only coincides with UK local time (GMT) from the last Sunday of October until the last Sunday of March. For the rest of the year the UK is on British Summer Time (BST), which is one hour ahead of UTC (UTC+01:00).
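If in doubt, derive the UTC value from the agreed UK local time programmatically rather than by hand; a sketch using the standard library (assumes Python 3.9+ for zoneinfo; the date is an arbitrary example):

from datetime import datetime
from zoneinfo import ZoneInfo

# The deadline as agreed with the category team, in UK local time (GMT/BST is handled automatically)
uk_deadline = datetime(2023, 7, 5, 17, 0, tzinfo=ZoneInfo("Europe/London"))

# The value to put in the *AtUTC fields
print(uk_deadline.astimezone(ZoneInfo("UTC")).strftime("%Y-%m-%dT%H:%M:%S.%fZ"))
# -> 2023-07-05T16:00:00.000000Z (16:00 UTC because July falls in BST, UTC+01:00)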
We normally send an email to inform users of the coming framework. Agree with Category and Sourcing when and to whom to send the email.
2(a) - open (with clarification questions open)¶
While a framework is open, suppliers can make their declaration and submit services that they want to provide.
Suppliers can also submit their public clarification questions if the framework's clarificationQuestionsOpen attribute is set. Answers to these clarification questions are published to all suppliers on the Digital Marketplace.
Before opening the framework, ensure any previous frameworks that we want applicants to be able to reuse declarations from have both their allowDeclarationReuse flag and their applicationsCloseAtUTC field set. Make sure any dates in the content are confirmed.
Note
IMPORTANT: co-ordinate the launch with product/comms team before setting a framework to ‘open’.
To open the framework:
curl -H "Authorization: Bearer <API_TOKEN>" -H "Content-Type: application/json" -X POST -d
'{
"updated_by":"developers.email@digital.cabinet-office.gov.uk",
"frameworks": {
"status":"open",
"clarificationQuestionsOpen": true
}
}' https://<API_ENDPOINT>/frameworks/<FRAMEWORK_SLUG>
or:
./scripts/api-clients-shell.py <STAGE>
data.update_framework(
'<FRAMEWORK_SLUG>',
{'status': 'open', 'clarificationQuestionsOpen': True},
'developers.email@digital.cabinet-office.gov.uk'
)
- The framework's product manager regularly publishes any clarification question answers by uploading PDFs in the admin. Suppliers should be emailed after each batch of answers has been published (maximum 1 email per day). This can be done with the Jenkins job notify-suppliers-of-framework-application-event-production.
- Update the daily-stats-snapshot Jenkins jobs for the new framework.
- Update the export_supplier_data_to_s3 Jenkins job with the new framework slug. This allows the framework manager to download the supplier contact list from the admin, to email them with clarification question answers and other updates.
- If you opened a framework on preview or staging, run the functional and visual regression tests against that environment and fix any test failures caused by the change in framework state.
2(b) - open (with clarification questions closed)¶
After the clarificationsCloseAtUTC date has passed, suppliers can no longer submit clarification questions (though they can still submit private questions about their application, which aren't published).
Clarification questions must be closed manually, with:
curl -H "Authorization: Bearer <API_TOKEN>" -H "Content-Type: application/json" -X POST -d
'{
"updated_by":"developers.email@digital.cabinet-office.gov.uk",
"frameworks":{
"status":"open",
"clarificationQuestionsOpen": false
}
}' https://<API_ENDPOINT>/frameworks/<FRAMEWORK_SLUG>
Reminding suppliers to complete their application¶
One week before applications close, a developer should run the Jenkins job Notify suppliers with incomplete applications - production to send email reminders (via Notify) to suppliers with incomplete applications, listing the steps they have left to do.
Scaling up the apps¶
Traffic will increase sharply in the week before applications close. Be ready to scale up the number of instances on the Supplier frontend, the API, and potentially the router and Antivirus API apps.
See Scaling PaaS apps for information on how to do this.
3 - pending¶
Applications for the framework are complete, and suppliers can no longer edit their declaration or services.
Close applications with:
curl -H "Authorization: Bearer <API_TOKEN>" -H "Content-Type: application/json" -X POST -d
'{
"updated_by":"developers.email@digital.cabinet-office.gov.uk",
"frameworks": {
"status": "pending"
}
}' https://<API_ENDPOINT>/frameworks/<FRAMEWORK_SLUG>
Any apps that were scaled up prior to applications closing can now be scaled down to the normal number of instances.
The daily-stats-snapshot Jenkins jobs can now be disabled.
If you have closed applications on preview or staging, run the functional and visual regression tests against that environment and fix any test failures caused by the change in framework state.
Exporting data for the category team¶
Application data for the framework now needs to be exported and sent to the category team for evaluation as soon as possible after applications close. There are some test supplier accounts in production for use by the developers and category team. These can be found in the credentials repo, and should be excluded from any reports or framework awards.
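A minimal sketch of the kind of filtering meant here, assuming a report CSV with a supplier_id column; the filenames and the set of test supplier IDs are hypothetical placeholders (use the real IDs from the credentials repo):

import csv

TEST_SUPPLIER_IDS = {700000, 700001}  # hypothetical - substitute the real IDs from the credentials repo

with open("report.csv") as infile, open("report-filtered.csv", "w", newline="") as outfile:
    reader = csv.DictReader(infile)
    writer = csv.DictWriter(outfile, fieldnames=reader.fieldnames)
    writer.writeheader()
    for row in reader:
        # Drop rows belonging to the shared test supplier accounts
        if int(row["supplier_id"]) not in TEST_SUPPLIER_IDS:
            writer.writerow(row)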
A developer should carry out the following tasks in the right order:
- Ensure the framework's intentionToAwardAtUTC date - and indeed the other dates - are set and correct.
- Run the Notify suppliers whether application made for framework job on Jenkins. This will check the application status of all suppliers who showed interest in the framework and email them to say either "we got your application" or "you didn't apply to the framework".
- Run export-framework-applications-at-close.py from our scripts repository to generate the list of supplier applications. This shows how far each supplier had got once they'd started their application, i.e. whether they completed their declaration and the number of services submitted/left in draft in each lot.
- Run the Mark definite framework results job on Jenkins to determine automatic passes, automatic fails and discretionary award statuses and set the majority of on_framework values in the database. The assessment schema previously generated in the frameworks repository is used here to validate the supplier declarations. Make sure the schema has been committed to the scripts repository, so that Jenkins can access it.
- Run export-framework-results-reasons.py from our scripts repository to export the results from the previous step, ready to pass to the category team. It generates three files, one each for successful, failed and discretionary results.
- (G-Cloud only) Run the Scan G-Cloud services for bad words job on Jenkins, specifying the 'draft' services option. Download the CSV report from the Jenkins box and send it to the category team for review. It's up to the category team to decide on any actions (e.g. disqualifying the supplier from the framework completely); however they may provide us with a list of suppliers or services to disable prior to the framework going live.
- Finally, run export-framework-applicant-details.py from our scripts repository to generate a separate list of all applicants including contact details from their declaration. This can be used by the category team to get in touch with suppliers to clarify things about their application when necessary.
We then wait for the category team to decide who passes and fails the discretionary applications, and which automatically-failed suppliers get on to the framework after all.
Re-assessing failed suppliers¶
The category team will supply us with a list of suppliers who should be on the framework after all, and these need to be updated. This process differs depending on whether the framework's services are assessed against a schema (as for DOS) or not (as for G-Cloud).
Note
You may want to check that the number of supplier IDs in the final pass/fail lists match with the total produced from the export script initially; while usually things are fine we have had issues before where some suppliers have been missed out and then they are unable to see their application status.
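A quick way to do that sanity check, assuming the export output and the category team's final lists are CSVs with a supplier_id column (the filenames and column name are assumptions; adjust them to the real files):

import csv

def supplier_ids(path, column="supplier_id"):
    with open(path) as f:
        return {int(row[column]) for row in csv.DictReader(f)}

exported = supplier_ids("framework-results-all.csv")  # hypothetical filename from the export step
final = supplier_ids("final-pass.csv") | supplier_ids("final-fail.csv")  # hypothetical filenames from the category team

# Any supplier missing here would be unable to see their application status
print("Missing from final lists:", sorted(exported - final))
print("Unexpected extras:", sorted(final - exported))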
Run the update-framework-results.py script to set those suppliers provided by the category team as on_framework. The suppliers' declarations and services are not changed, as the mark-definite-framework-results.py script will have already set their draft service statuses to submitted.
Previously, DOS services were also assessed via a pass/fail schema. They are now validated during the supplier application period - a supplier can’t create a service that would ‘fail’. So the re-assessment process is now the same for G-Cloud and DOS.
Generating framework agreements¶
Make sure the relevant framework document templates are in the agreements repository and that they have been signed off by the category team.
Set the framework agreement details in the database, if this hasn’t been done already:
curl -H "Authorization: Bearer <API_TOKEN>" -H "Content-Type: application/json" -X POST -d
'{
  "updated_by": "developers.email@digital.cabinet-office.gov.uk",
  "frameworks": {
    "frameworkAgreementDetails": {
      "contractNoticeNumber": "RM1557xxiii-v1.2-12-13-2525",
      "countersignerName": "Joe Bloggs",
      "countersignerRole": "Director - Technology",
      "frameworkAgreementVersion": "RM1557xxiii",
      "frameworkExtensionLength": "12 months",
      "frameworkRefDate": "32-13-2525",
      "frameworkURL": "https://www.gov.uk/government/publications/g-things-23-framework-agreement",
      "lotDescriptions": {
        "cloud-hosting": "Lot 1: Cloud hosting",
        "cloud-software": "Lot 2: Cloud software",
        "cloud-support": "Lot 3: Cloud support"
      },
      "lotOrder": [
        "cloud-hosting",
        "cloud-software",
        "cloud-support"
      ],
      "variations": {}
    }
  }
}' https://<API_ENDPOINT>/frameworks/<FRAMEWORK_SLUG>
These framework agreement details are required to generate the contracts that suppliers will sign if they are accepted onto the framework. Some of this information will not be available when the framework is added to the database - triple check all the values with the category team, especially frameworkRefDate (usually the date of the last day of standstill, but this can vary).
Run generate-framework-agreement-signature-pages.py from our scripts repository for a single supplier ID locally to test the generated framework agreement, following the detailed instructions in the script's docstring. The generated document should be signed off by the category team before it is sent to suppliers using the Jenkins job during standstill.
Note
If a supplier has made a mistake on their agreement details, they can re-sign, see: Supplier has made a mistake when signing framework agreement.
Uploading ‘fail letters’¶
The category team must supply result letters for all suppliers who failed to make it onto the framework. These must be PDFs uploaded to the agreements bucket with the following filename and path format:
s3://digitalmarketplace-agreements-<stage>-<stage>/<framework_slug>/agreements/<supplier_id>/<supplier_id>-result-letter.pdf
These can also be uploaded using the bulk-upload-documents.py script, and should be done before the beginning of standstill.
Any documents submitted by suppliers as part of the application are automatically scanned for viruses via the Antivirus API.
Remember to use the production-developers AWS account when running the upload scripts.
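The bulk-upload-documents.py script is the usual route; purely to illustrate the required key format, a hedged boto3 sketch (the bucket name, framework slug and supplier ID are placeholders) might look like:

import boto3

# Placeholders - substitute the real stage, framework slug and supplier ID
bucket = "digitalmarketplace-agreements-<stage>-<stage>"
framework_slug = "g-things-23"
supplier_id = 123456

s3 = boto3.client("s3")  # run with production-developers AWS account credentials
key = f"{framework_slug}/agreements/{supplier_id}/{supplier_id}-result-letter.pdf"
s3.upload_file(f"{supplier_id}-result-letter.pdf", bucket, key, ExtraArgs={"ContentType": "application/pdf"})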
4 - standstill¶
Results are made available to suppliers at the beginning of standstill, along with signature pages to sign for suppliers awarded onto the framework. This is also known as 'Intention to Award' (the actual 'awarding' of the framework happens at the end of standstill).
During standstill, buyers will see banner messages on the direct award project saved search page (/buyers/direct-award/g-cloud/projects/<project_id>), explaining how saved search behaves during the transition between frameworks. This content relies on both frameworks repository metadata and framework dates in the database.
- Ensure that following_framework.yml exists in the frameworks repository metadata folder for the old framework, and that it contains slug: <following-framework-slug>. For G9, for example, it would contain slug: g-cloud-10.
- Ensure that frameworkLiveAtUTC is set correctly in the database for the incoming framework.
- Ensure the links to these documents in the frameworks repository at frameworks/<frameworks_slug>/messages/urls.yml are correct and the new version is pulled into supplier-frontend. Suppliers will see these links when we email them in the next step.
- Update the framework-specific content in supplier-frontend for the /frameworks/<framework_slug>/sign-framework-agreement route. Ensure this is signed off by the category team.
- The framework status can then be set to standstill using:

curl -H "Authorization: Bearer <API_TOKEN>" -H "Content-Type: application/json" -X POST -d
'{
  "updated_by":"developers.email@digital.cabinet-office.gov.uk",
  "frameworks": {
    "status": "standstill"
  }
}' https://<API_ENDPOINT>/frameworks/<FRAMEWORK_SLUG>
Finalised versions of the framework agreement documents must now be published on GOV.UK by the framework product manager / content designer.
Upload the list of successful suppliers as a framework communication. Check that this does not include any test suppliers.
Run the Notify suppliers of Intention To Award (41) Jenkins Job to email all successful suppliers with their result. The email also contains a link to log in and sign their framework agreement. Once logged in, they can also view links to the published documents on GOV.UK.
Run the Notify suppliers that their application has been unsuccessful Jenkins job to email all unsuccessful suppliers.
If you've put a framework into standstill on preview or staging, run the functional and visual regression tests against that environment and fix any test failures caused by the change in framework state.
Towards the end of standstill we send a reminder email to suppliers who have not yet signed their framework agreement by running the Remind suppliers to sign framework agreement (43) Jenkins Job.
At the end of standstill, update the job_definitions/generate_upload_and_notify_counterpart_signature_pages.yml job on Jenkins so that framework agreements start being countersigned for the framework. Ensure the job runs successfully before making the framework live.
Unsuccessful suppliers will be contacted directly by the category team with their result, so as long as we have already uploaded the PDF result letters (see ‘fail letters’ in the previous step) there is nothing more to do for them.
Preparing services to go live¶
Framework services can now be migrated from drafts to “real” services. This step is slightly different for G-Cloud and Digital Outcomes and Specialists.
G-Cloud¶
Warning
This step took approximately 10 hours for G-Cloud 11’s 31000 services. While some improvements have been made for G-Cloud 12, it’s recommended to start this task at least the morning before the end of standstill.
- Run the Migrate copied documents Jenkins job. If a user copies a service from a previous framework, it will include references to documents on the previous framework. This job copies the documents from the previous framework's documents bucket into the current framework's submissions bucket and updates the draft services with the appropriate filenames. Note: this job only needs to be run if service documents were copyable from the previous framework.
- For preview only, you need to either:
  - work out how to copy all the submission files in the S3 bucket from production to preview (and document it here); or
  - abandon this process and do it in production instead, then copy the production database with published services back to preview.
- Disconnect the destination documents bucket from the antivirus SNS topic so that the Antivirus API won't get overwhelmed by the volume of documents being uploaded during the following process. This can be done by altering the terraform and applying this temporary configuration. Being disconnected from real-time scanning triggers shouldn't prevent catch-up jobs from attempting to scan these new files at a more controlled pace overnight, so there shouldn't be any concern about ending up with unscanned files in the bucket.
- Run the Publish draft services Jenkins job. Make sure to specify the appropriate AWS account for the stage (production for Production and Staging, development for Preview); otherwise there will be permissions woes when a user/admin later tries to update their documents and the webapp finds itself unable to overwrite the file (more information is available in the docstring of the script). This job will:
  - copy submitted draft services from "on framework" suppliers into the services table with a new service ID
  - update the draft services table with the new service IDs and leave the drafts there
  - copy the documents from the submissions bucket into the live documents bucket with an appropriate filename and make them readable by everyone
  - update the service data with the new public document URLs
  - leave drafts for unsuccessful suppliers unaltered in the database
  Publishing services takes several hours! Be patient and check the Jenkins console log for any failures. The script can be re-run manually, optionally supplying a file containing the draft service IDs that failed the first time around. Check the publish-draft-services.py script docstring for details.
- Run oneoff/acknowledge_publish_draft_service_updates.py to acknowledge the (several thousand!) audit events that have just been created during the service publishing. This will stop these appearing as changes for the category team to approve in the admin.
- Run a script to suspend all services of suppliers who have not signed the framework agreement and notify them, see: Suspending suppliers without an agreement.
At this point, check:

- that the new services are not yet visible on the marketplace
- that the public document URLs work (see the sketch below)
- that the documents show the intended owner account when viewed in the AWS console
- that the count of services on the new framework matches the count of submitted drafts from "on framework" suppliers
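For the document URL check in particular, something like the following sketch saves clicking through pages by hand. It assumes you have collected the public document URLs (e.g. from the published service data); the sample URL is purely hypothetical:

import requests

# Hypothetical sample - gather the real URLs from the published service data
document_urls = [
    "https://assets.example.digitalmarketplace/g-things-23/documents/123456/1234567890123456-pricing-document.pdf",
]

for url in document_urls:
    response = requests.head(url)
    # Print the status code for each URL and investigate anything unexpected before go-live
    print(response.status_code, url)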
Once G-Cloud services are migrated:

- Check that the search mapping for the new framework has been committed to the Search API mappings folder (it should be named something like services-g-things-23.json) and that the commit has been released to production. See the Search API README for more details.
- Run the Create index - <stage> Jenkins job with the new mapping, to create and populate the new index. Do not name the index as any known framework family or framework slug! It should be timestamped with the current date, e.g. g-things-23-2019-06-31 (see the sketch below).
- Re-enable the antivirus SNS topic for the documents bucket.
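A trivial way to generate a compliant, timestamped index name (a sketch only; the slug is this page's example framework):

from datetime import date

framework_slug = "g-things-23"
# Append the current date so the index name is never just the framework slug or family
index_name = f"{framework_slug}-{date.today().isoformat()}"
print(index_name)  # e.g. g-things-23-2019-06-28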
Digital Outcomes and Specialists¶
DOS services are not visible on the Digital Marketplace, and do not need to be indexed or scanned. There’s just one step to do:
Run the publish-draft-services.py script, found under the framework-applications/ directory of the scripts repo.

For DOS there are no documents to be transferred between buckets, so the publishing script will be much quicker than for G-Cloud (around 25 minutes for DOS4), and will not need any bucket names to be supplied as arguments. This also means there are no extra audit events, so we don't need to run the acknowledge_publish_draft_service_updates.py script.
5 - live¶
This step differs for G-Cloud and Digital Outcomes and Specialists.
Digital Outcomes and Specialists¶
To make a new Digital Outcomes and Specialists framework live:
- Run the export-dos-*.py scripts from the scripts repository to generate new CSVs of services for each lot.
- Check the CSVs with a product manager/framework owner, who can decide what data should be included/changed. In the past we have manually cleaned the data (not all columns are needed), renamed columns and checked the correctness of the data. The final CSV should match the formatting of the current version.
- Upload the CSVs for each lot to the S3 bucket, following the naming conventions for previous framework iterations: digitalmarketplace-communications-production-production/<framework_slug>/communications/catalogues
- Using the S3 GUI, make the files available to be read by the public, and update the metadata for the new files to set the Content-disposition to attachment (a scripted alternative is sketched after this list).
- Update the API's DM_FRAMEWORK_TO_ES_INDEX config setting and release the app to production, so that any buyer edits (such as clarification questions and answers) to briefs are immediately re-indexed. (Note that this config is in the Data API, not the Search API!)
- Check for any hardcoded instances of the framework slug in the frontend templates (e.g. Supplier FE dashboard, Admin FE home page) and make sure these are updated (or better yet, un-hardcode them!).
- Use the DOS-specific API endpoint to set the framework live. The endpoint will update the incoming framework's status to live and set the outgoing framework's status to expired. It will also migrate any existing drafts for the outgoing framework to be associated with the incoming framework. The transition between DOS frameworks should be relatively transparent for buyers, and we don't want them to lose their existing drafts. Either use the API client, or use:

curl -H "Authorization: Bearer <API_TOKEN>" -H "Content-Type: application/json" -X POST -d
'{
  "updated_by": "developers.email@digital.cabinet-office.gov.uk",
  "expiringFramework": "<EXPIRING_FRAMEWORK_SLUG>"
}' https://<API_ENDPOINT>/frameworks/transition-dos/<INCOMING_FRAMEWORK>

- Suspend suppliers who have not yet signed their framework agreement using the bulk suspend script. Make sure to save the output of the script for audit purposes; you may also need to know who wasn't suspended for the next step.
- Let suppliers who have signed their framework agreements know that their services are live by running the notify-suppliers-with-signed-framework-agreement.py script. You will probably find at this point that suppliers who were suspended are now frantically signing their agreements; you may want to use the output of the previous step to avoid emailing these suppliers, as they won't be unsuspended until the next time the countersigning script runs.
- Create new empty mailing lists for each lot in Mailchimp (see Creating new lists in Mailchimp) and update the send-dos-opportunities-email.py and upload-dos-opportunities-email-list.py scripts with the new list IDs:
  - send-dos-opportunities-email.py is called by the daily job notify_suppliers_of_dos_opportunities.yml, which creates and sends a Mailchimp campaign to suppliers subscribed to the lot list for the latest opportunities
  - upload-dos-opportunities-email-list.py is called by the daily job upload_dos_opportunities_email_list.yml, which subscribes new supplier emails to the corresponding Mailchimp lists
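If you prefer to script the public-read/Content-Disposition step rather than use the S3 console, a hedged boto3 sketch (the object key is a placeholder; check the real filenames in the bucket):

import boto3

bucket = "digitalmarketplace-communications-production-production"
key = "<framework_slug>/communications/catalogues/<lot>-services.csv"  # placeholder key

s3 = boto3.client("s3")
# Copy the object onto itself, replacing its metadata so downloads are served as attachments,
# and make it publicly readable
s3.copy_object(
    Bucket=bucket,
    Key=key,
    CopySource={"Bucket": bucket, "Key": key},
    MetadataDirective="REPLACE",
    ContentType="text/csv",
    ContentDisposition="attachment",
    ACL="public-read",
)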
Update the following scripts, test suites and Jenkins jobs/variables to include the new DOS framework. This list is probably not exhaustive, so you should double check:

- Scripts:
  - index_to_search_service.py
- Jenkins jobs and variables:
  - export_supplier_data_to_s3.yml
  - upload_dos_opportunities_email_list.yml
  - digitalmarketplace-jenkins/playbooks/roles/jenkins/defaults/main.yml
- Tests:
  - Functional tests
  - Visual regression tests
G-Cloud¶
To make a new G-Cloud framework live:
- Create an alias for the new index matching the latest live framework slug (e.g. g-things-22) on the datestamped index that includes the new services. Use the Update index alias - <STAGE> job on Jenkins. Check that the number of docs on the Search API /_status endpoint matches the number of expected live services, and that the alias is present on the index (a sketch for this check follows the list).
- Update the following jobs on Jenkins:
  - export_supplier_data_to_s3.yml
- Update the Jenkins search_config to reference the correct framework(s) and apply the changes with make reconfigure. This config will feed into the nightly Update index - <STAGE> jobs and the Clean and apply database dump - <STAGE> job. Note that Jenkins may need to restart following this update.
- Add the new g-things-23 index to the API's DM_FRAMEWORK_TO_ES_INDEX config setting and release the app to production, so that any supplier edits to live services are immediately re-indexed. (Note that this config is in the Data API, not the Search API!)
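For the /_status document count check, a rough sketch that simply dumps the status payload so you can compare the per-index counts by eye (the endpoint path is taken from the step above; the response shape is not assumed here, so inspect the output):

import json
import requests

SEARCH_API_ENDPOINT = "https://<SEARCH_API_ENDPOINT>"  # placeholder

# Print the raw status payload and compare the per-index document counts against the
# number of live services expected on the new framework. Add an Authorization header
# if the endpoint requires one on your stage.
response = requests.get(f"{SEARCH_API_ENDPOINT}/_status")
print(json.dumps(response.json(), indent=2))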
Note
Triple check everything’s ready with product and comms. Let’s go live!
Set the framework status to live using:

curl -H "Authorization: Bearer <API_TOKEN>" -H "Content-Type: application/json" -X POST -d
'{
  "updated_by":"developers.email@digital.cabinet-office.gov.uk",
  "frameworks": {
    "status": "live"
  }
}' https://<API_ENDPOINT>/frameworks/<FRAMEWORK_SLUG>
Check that the new service pages are displayed on the Digital Marketplace and can be found using search.
Pat yourself on the back - you’re live!
If you made a framework live on preview or staging, run the functional and visual regression tests against that environment and fix any test failures caused by the change in framework state.
6 - expired¶
G-Cloud services on the framework are still visible on the Digital Marketplace but show a banner explaining that the framework has expired and the service can no longer be procured. Services should no longer be returned by search - the service pages are only visible by direct navigation to them.
Set the G-Cloud framework status to expired using:
curl -H "Authorization: Bearer <API_TOKEN>" -H "Content-Type: application/json" -X POST -d
'{
"updated_by":"developers.email@digital.cabinet-office.gov.uk",
"frameworks": {
"status": "expired"
}
}' https://<API_ENDPOINT>/frameworks/g-things-22
DOS frameworks are automatically expired when the new iteration is made live (see above) and do not need to be manually set to expired.