diff --git a/content/authentication/keeping-your-account-and-data-secure/removing-sensitive-data-from-a-repository.md b/content/authentication/keeping-your-account-and-data-secure/removing-sensitive-data-from-a-repository.md index d7282e98e86c..5cf648f9602b 100644 --- a/content/authentication/keeping-your-account-and-data-secure/removing-sensitive-data-from-a-repository.md +++ b/content/authentication/keeping-your-account-and-data-secure/removing-sensitive-data-from-a-repository.md @@ -20,9 +20,9 @@ shortTitle: Remove sensitive data ## About removing sensitive data from a repository -When altering your repository's history using tools like `git filter-repo` or the BFG Repo-Cleaner, it's crucial to understand the implications, especially regarding open pull requests and sensitive data. +When altering your repository's history using tools like `git filter-repo`, it's crucial to understand the implications, especially regarding open pull requests and sensitive data. -The `git filter-repo` tool and the BFG Repo-Cleaner rewrite your repository's history, which changes the SHAs for existing commits that you alter and any dependent commits. Changed commit SHAs may affect open pull requests in your repository. We recommend merging or closing all open pull requests before removing files from your repository. +The `git filter-repo` tool rewrites your repository's history, which changes the SHAs for existing commits that you alter and any dependent commits. Changed commit SHAs may affect open pull requests in your repository. We recommend merging or closing all open pull requests before removing files from your repository. You can remove the file from the latest commit with `git rm`. For information on removing a file that was added with the latest commit, see "[AUTOTITLE](/repositories/working-with-files/managing-large-files/about-large-files-on-github#removing-files-from-a-repositorys-history)." @@ -48,37 +48,7 @@ If the commit that introduced the sensitive data exists in any forks, it will co Consider these limitations and challenges in your decision to rewrite your repository's history. -## Purging a file from your repository's history - -You can purge a file from your repository's history using either the `git filter-repo` tool or the BFG Repo-Cleaner open source tool. - -> [!NOTE] If sensitive data is located in a file that's identified as a binary file, you'll need to remove the file from the history, as you can't modify it to remove or replace the data. - -### Using the BFG - -The [BFG Repo-Cleaner](https://rtyley.github.io/bfg-repo-cleaner/) is a tool that's built and maintained by the open source community. It provides a faster, simpler alternative to `git filter-repo` for removing unwanted data. - -For example, to remove your file with sensitive data and leave your latest commit untouched, run: - -```shell -bfg --delete-files YOUR-FILE-WITH-SENSITIVE-DATA -``` - -To replace all text listed in `passwords.txt` wherever it can be found in your repository's history, run: - -```shell -bfg --replace-text passwords.txt -``` - -After the sensitive data is removed, you must force push your changes to {% data variables.product.product_name %}. Force pushing rewrites the repository history, which removes sensitive data from the commit history. If you force push, it may overwrite commits that other people have based their work on. - -```shell -git push --force -``` - -See the [BFG Repo-Cleaner](https://rtyley.github.io/bfg-repo-cleaner/)'s documentation for full usage and download instructions. 
-
-### Using git filter-repo
+## Purging a file from your repository's history using git-filter-repo

> [!WARNING] If you run `git filter-repo` after stashing changes, you won't be able to retrieve your changes with other stash commands. Before running `git filter-repo`, we recommend unstashing any changes you've made. To unstash the last set of changes you've stashed, run `git stash show -p | git apply -R`. For more information, see [Git Tools - Stashing and Cleaning](https://git-scm.com/book/en/v2/Git-Tools-Stashing-and-Cleaning).

@@ -178,7 +148,7 @@ To illustrate how `git filter-repo` works, we'll show you how to remove your fil

## Fully removing the data from {% data variables.product.prodname_dotcom %}

-After using either the BFG tool or `git filter-repo` to remove the sensitive data and pushing your changes to {% data variables.product.product_name %}, you must take a few more steps to fully remove the data from {% data variables.product.product_name %}.
+After using `git filter-repo` to remove the sensitive data and pushing your changes to {% data variables.product.product_name %}, you must take a few more steps to fully remove the data from {% data variables.product.product_name %}.

{% ifversion ghec %}
-1. If the repository was migrated using the {% data variables.product.prodname_importer_proper_name %}, there may be some non-standard Git references that follow the pattern `refs/github-services`, that neither the BFG tool or `git filter-repo` can remove. In this case, remove those references running the following commands in your local copy of the repository:
+1. If the repository was migrated using the {% data variables.product.prodname_importer_proper_name %}, there may be some non-standard Git references that follow the pattern `refs/github-services`, which `git filter-repo` cannot remove. In this case, remove those references by running the following commands in your local copy of the repository:
@@ -205,22 +175,6 @@ After using either the BFG tool or `git filter-repo` to remove the sensitive dat
1. Tell your collaborators to [rebase](https://git-scm.com/book/en/v2/Git-Branching-Rebasing), _not_ merge, any branches they created off of your old (tainted) repository history. One merge commit could reintroduce some or all of the tainted history that you just went to the trouble of purging.

-1. If you used `git filter-repo`, you can skip this step.
-
-   If you used the BFG tool, after rewriting, you can clean up references in your local repository to the old history to be dereferenced and garbage collected with the following commands (using Git 1.8.5 or newer):
-
-   ```shell
-   $ git reflog expire --expire=now --all
-   $ git gc --prune=now
-   > Counting objects: 2437, done.
-   > Delta compression using up to 4 threads.
-   > Compressing objects: 100% (1378/1378), done.
-   > Writing objects: 100% (2437/2437), done.
-   > Total 2437 (delta 1461), reused 1802 (delta 1048)
-   ```
-
-   > [!NOTE] You can also achieve this by pushing your filtered history to a new or empty repository and then making a fresh clone from {% data variables.product.product_name %}.
-
{% ifversion ghes %}

## Identifying reachable commits

@@ -245,7 +199,7 @@ If references are found in any forks, the results will look similar, but will st
ghe-nwo NWO
```

-The same procedure using the BFG tool or `git filter-repo` can be used to remove the sensitive data from the repository's forks. Alternatively, the forks can be deleted altogether, and if needed, the repository can be re-forked once the cleanup of the root repository is complete.
+The same procedure using `git filter-repo` can be used to remove the sensitive data from the repository's forks. Alternatively, the forks can be deleted altogether, and if needed, the repository can be re-forked once the cleanup of the root repository is complete.
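+
+For example, a minimal sketch of cleaning up a fork with `git filter-repo` might look like the following. It assumes `git-filter-repo` is installed, and that HOSTNAME, OWNER, FORK, and PATH-TO-YOUR-FILE-WITH-SENSITIVE-DATA are placeholders for your own values:
+
+```shell
+# Make a fresh clone of the fork
+git clone https://HOSTNAME/OWNER/FORK.git
+cd FORK
+
+# Remove the sensitive file from every commit in the fork's history
+git filter-repo --invert-paths --path PATH-TO-YOUR-FILE-WITH-SENSITIVE-DATA
+
+# git filter-repo removes the origin remote, so add it back, then force-push
+# the rewritten history
+git remote add origin https://HOSTNAME/OWNER/FORK.git
+git push origin --force --all
+git push origin --force --tags
+```
+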
Once you have removed the commit's references, re-run the commands to double-check.
diff --git a/content/code-security/getting-started/best-practices-for-preventing-data-leaks-in-your-organization.md b/content/code-security/getting-started/best-practices-for-preventing-data-leaks-in-your-organization.md
index 88e4241c1029..c12dfa477933 100644
--- a/content/code-security/getting-started/best-practices-for-preventing-data-leaks-in-your-organization.md
+++ b/content/code-security/getting-started/best-practices-for-preventing-data-leaks-in-your-organization.md
@@ -105,7 +105,7 @@ To ensure that all code is properly reviewed prior to being merged into the defa

## Mitigate data leaks

-If a user pushes sensitive data, ask them to remove it by using the `git filter-repo` tool or the BFG Repo-Cleaner open source tool. For more information, see "[AUTOTITLE](/authentication/keeping-your-account-and-data-secure/removing-sensitive-data-from-a-repository)." Also, it is possible to revert almost anything in Git. For more information, see [{% data variables.product.prodname_blog %}](https://github.blog/2015-06-08-how-to-undo-almost-anything-with-git/).
+If a user pushes sensitive data, ask them to remove it by using the `git filter-repo` tool. For more information, see "[AUTOTITLE](/authentication/keeping-your-account-and-data-secure/removing-sensitive-data-from-a-repository)." Also, it is possible to revert almost anything in Git. For more information, see [{% data variables.product.prodname_blog %}](https://github.blog/2015-06-08-how-to-undo-almost-anything-with-git/).

At the organization level, if you're unable to coordinate with the user who pushed the sensitive data to remove it, we recommend you contact {% data variables.contact.contact_support %} with the concerning commit SHA.
diff --git a/content/code-security/secret-scanning/introduction/about-secret-scanning.md b/content/code-security/secret-scanning/introduction/about-secret-scanning.md
index 98849b7cad72..494efd3b37c9 100644
--- a/content/code-security/secret-scanning/introduction/about-secret-scanning.md
+++ b/content/code-security/secret-scanning/introduction/about-secret-scanning.md
@@ -56,7 +56,7 @@ Below is a typical workflow that explains how {% data variables.product.prodname

* **Remediation:** You then need to take appropriate actions to remediate the exposure. This might include:
  * Rotating the affected credential to ensure it is no longer usable.
-  * Removing the secret from the repository's history (using tools like BFG Repo-Cleaner or {% data variables.product.prodname_dotcom %}'s built-in features).
+  * Removing the secret from the repository's history (using tools like `git-filter-repo` or {% data variables.product.prodname_dotcom %}'s built-in features). A sketch of this step follows the list.
* **Monitoring:** It's good practice to regularly audit and monitor your repositories to ensure no other secrets are exposed.
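+
+For example, the history-rewriting step might look like the following sketch, which uses `git filter-repo`'s `--replace-text` option from a fresh clone of the affected repository. The file name `replacements.txt`, the token value, and the OWNER/REPO URL are placeholders; rotate the real secret first, as described above:
+
+```shell
+# replacements.txt lists the leaked values to scrub, one expression per line,
+# for example:  ghp_YOUR-LEAKED-TOKEN==>REMOVED
+git filter-repo --replace-text replacements.txt
+
+# git filter-repo removes the origin remote, so add it back before force-pushing
+git remote add origin https://github.com/OWNER/REPO.git
+git push origin --force --all
+```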
diff --git a/content/code-security/secret-scanning/using-advanced-secret-scanning-and-push-protection-features/custom-patterns/defining-custom-patterns-for-secret-scanning.md b/content/code-security/secret-scanning/using-advanced-secret-scanning-and-push-protection-features/custom-patterns/defining-custom-patterns-for-secret-scanning.md index 5fa733d85fc6..4f5b8348758d 100644 --- a/content/code-security/secret-scanning/using-advanced-secret-scanning-and-push-protection-features/custom-patterns/defining-custom-patterns-for-secret-scanning.md +++ b/content/code-security/secret-scanning/using-advanced-secret-scanning-and-push-protection-features/custom-patterns/defining-custom-patterns-for-secret-scanning.md @@ -138,12 +138,8 @@ After your pattern is created, {% data variables.product.prodname_secret_scannin Before defining a custom pattern, you must ensure that you enable secret scanning for your enterprise account. For more information, see "[Enabling {% data variables.product.prodname_GH_advanced_security %} for your enterprise]({% ifversion fpt or ghec %}/enterprise-server@latest/{% endif %}/admin/advanced-security/enabling-github-advanced-security-for-your-enterprise)." > [!NOTE] -{% ifversion custom-pattern-dry-run-ga %} > * At the enterprise level, only the creator of a custom pattern can edit the pattern, and use it in a dry run. > * {% data reusables.secret-scanning.dry-runs-enterprise-permissions %} -{% else %} -> As there is no dry-run functionality, we recommend that you test your custom patterns in a repository before defining them for your entire enterprise. That way, you can avoid creating excess false-positive {% data variables.secret-scanning.alerts %}. -{% endif %} {% data reusables.enterprise-accounts.access-enterprise %} {% data reusables.enterprise-accounts.policies-tab %}{% ifversion security-feature-enablement-policies %} diff --git a/content/copilot/managing-copilot/managing-copilot-for-your-enterprise/managing-policies-and-features-for-copilot-in-your-enterprise.md b/content/copilot/managing-copilot/managing-copilot-for-your-enterprise/managing-policies-and-features-for-copilot-in-your-enterprise.md index 452b8cf88fc7..c22a7fb7dd81 100644 --- a/content/copilot/managing-copilot/managing-copilot-for-your-enterprise/managing-policies-and-features-for-copilot-in-your-enterprise.md +++ b/content/copilot/managing-copilot/managing-copilot-for-your-enterprise/managing-policies-and-features-for-copilot-in-your-enterprise.md @@ -88,6 +88,10 @@ By default, {% data variables.product.prodname_copilot_chat_short %} uses the `G * `o1-preview`: This model is focused on advanced reasoning and solving complex problems, in particular in math and science. It responds more slowly than the `gpt-4o` model. Each member of your enterprise can make 10 requests to this model per day. * `o1-mini`: This is the faster version of the `o1-preview` model, balancing the use of complex reasoning with the need for faster responses. It is best suited for code generation and small context operations. Each member of your enterprise can make 50 requests to this model per day. +### {% data variables.product.prodname_copilot_short %} Metrics API access + +Enable this policy to allow users to use the {% data variables.product.prodname_copilot_short %} Metrics API. See "[AUTOTITLE](/rest/copilot/copilot-metrics)." 
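+
+For example, once the policy is enabled, a member of your enterprise could check their access with a request like the following sketch. YOUR-TOKEN and YOUR-ENTERPRISE are placeholders, and the token needs an appropriate scope, such as `manage_billing:copilot`:
+
+```shell
+curl -L \
+  -H "Accept: application/vnd.github+json" \
+  -H "Authorization: Bearer YOUR-TOKEN" \
+  https://api.github.com/enterprises/YOUR-ENTERPRISE/copilot/metrics
+```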
+ ## Configuring policies for {% data variables.product.prodname_copilot %} {% data reusables.enterprise-accounts.access-enterprise %} diff --git a/content/copilot/managing-copilot/managing-github-copilot-in-your-organization/reviewing-activity-related-to-github-copilot-in-your-organization/analyzing-usage-over-time-with-the-copilot-metrics-api.md b/content/copilot/managing-copilot/managing-github-copilot-in-your-organization/reviewing-activity-related-to-github-copilot-in-your-organization/analyzing-usage-over-time-with-the-copilot-metrics-api.md new file mode 100644 index 000000000000..fd6661a55a30 --- /dev/null +++ b/content/copilot/managing-copilot/managing-github-copilot-in-your-organization/reviewing-activity-related-to-github-copilot-in-your-organization/analyzing-usage-over-time-with-the-copilot-metrics-api.md @@ -0,0 +1,303 @@ +--- +title: Analyzing usage over time with the Copilot metrics API +shortTitle: Copilot metrics API +intro: 'Learn how to connect to the API, store data, and analyze usage trends.' +versions: + feature: copilot +product: '{% data variables.product.prodname_copilot_for_business %} or {% data variables.product.prodname_copilot_enterprise %}' +permissions: 'Organization owners, enterprise owners, and billing managers' +layout: inline +topics: + - Copilot +--- + +## Introduction + +You can use the [AUTOTITLE](/rest/copilot/copilot-metrics) to see trends in how users are adopting {% data variables.product.prodname_copilot %}. During a rollout of {% data variables.product.prodname_copilot %}, it's useful to view these trends to check that people are using their assigned licenses, see which features people are using, and understand the effect of your company's enablement plan on developers. + +The API includes: + +* Data for the last 28 days +* Numbers of active users and engaged users +* Breakdowns by language and IDE +* The option to view metrics for an enterprise, organization, or team + +If you currently use the [AUTOTITLE](/rest/copilot/copilot-usage), we recommend migrating to the [AUTOTITLE](/rest/copilot/copilot-metrics) as soon as possible. + +This guide demonstrates how to query the API, store data, and analyze a trend for changes to the number of users per week. The examples in this guide use the endpoint for an organization, but you can adapt the examples to meet your needs. + +## About endpoint availability + +Endpoints are available to get data for an enterprise, organization, organization team, or enterprise team. + +* If you have a {% data variables.product.prodname_copilot_for_business %} or {% data variables.product.prodname_copilot_enterprise %} subscription as part of a regular organization or enterprise, you can use the endpoints for **an enterprise, an organization, or an organization team**. You don't have access to enterprise teams unless you're enrolled in a preview. +* If you use a dedicated enterprise for {% data variables.product.prodname_copilot_for_business %}—an enterprise account without the ability to create organizations—you can use the endpoints for **an enterprise or an enterprise team**. + +## Prerequisites + +* The **{% data variables.product.prodname_copilot_short %} metrics API access** policy must be enabled for your enterprise or organization. 
See "[AUTOTITLE](/copilot/managing-copilot/managing-github-copilot-in-your-organization/managing-policies-for-copilot-in-your-organization)" or "[AUTOTITLE](/enterprise-cloud@latest/copilot/managing-copilot/managing-copilot-for-your-enterprise/managing-policies-and-features-for-copilot-in-your-enterprise)." +* The organization, enterprise, or team that you're querying must have enough active {% data variables.product.prodname_copilot_short %} users. The API only returns results for a given day if there are **five or more members with active {% data variables.product.prodname_copilot_short %} licenses** for that day. +* In this example, we'll create a JavaScript script for querying and analyzing the data. To run this script locally, you must install [Node.js](https://nodejs.org/en), then install the [Octokit.js SDK](https://github.com/octokit/octokit.js#usage) with `npm install -g octokit`. + +## 1. Create a {% data variables.product.pat_generic %} + +For our example, to get metrics for an organization, we'll create a {% data variables.product.pat_v1 %} with the `manage_billing:copilot` scope. See "[AUTOTITLE](/authentication/keeping-your-account-and-data-secure/managing-your-personal-access-tokens#creating-a-personal-access-token-classic)." + +If you're using another endpoint, you may need different scopes. See "[AUTOTITLE](/rest/copilot/copilot-metrics)." + +## 2. Connect to the API + +We will call the API from a script and save the response as a variable. We can then store the data externally and analyze trends in the data. + +The following example uses the [Octokit client](https://github.com/octokit) for JavaScript. You can use other methods to call the API, such as cURL or the {% data variables.product.prodname_cli %}. + +### Example + +In this example: + +* Replace YOUR_TOKEN with your {% data variables.product.pat_generic %}. +* Replace YOUR_ORG with your organization name, such as `octo-org`. + +```javascript annotate +// Import Octokit +import { Octokit } from "octokit"; + +// Set your token and organization +const octokit = new Octokit({ + auth: 'YOUR_TOKEN' +}); +const org = 'YOUR_ORG'; + +// Set other variables if required for the endpoint you're using +/* +const team = 'YOUR_TEAM'; +const enterprise = 'YOUR_ENTERPRISE'; +const entTeam = 'YOUR_ENTERPRISE_TEAM'; +*/ + +// Call the API +async function orgMetrics() { + const resp = await octokit.request(`GET /orgs/${org}/copilot/metrics`, { + org: 'ORG', + headers: { + 'X-GitHub-Api-Version': '2022-11-28' + } + }); + + const copilotUsage = resp.data; + + console.log(copilotUsage); + } + +// Call the function +orgMetrics(); +``` + +### Run the script locally + +To test the script locally, save the file as `copilot.mjs`, then run `node copilot.mjs`. + +>[!IMPORTANT] The **.mjs** file type is important. The `import { Octokit }` statement may not work with a regular `.js` file. + +In your terminal, you should see output with a JSON array like the following. + +```json +[ + { + date: '2024-11-07', + copilot_ide_chat: { editors: [Array], total_engaged_users: 14 }, + total_active_users: 28, + copilot_dotcom_chat: { models: [Array], total_engaged_users: 4 }, + total_engaged_users: 28, + copilot_dotcom_pull_requests: { total_engaged_users: 0 }, + copilot_ide_code_completions: { editors: [Array], total_engaged_users: 22 } + }, +... +``` + +## 3. Store the data + +To analyze trends over longer than 28 days, you will need to: + +* Call the API daily, using a cron job or scheduled {% data variables.product.prodname_actions %} workflow. 
+* Store data locally or with a database service such as MySQL. +* Query the data to identify trends over time. + +### Example + +In this example we'll save the data to a local `.json` file. To do this, we will import some modules for working with files, and update the `orgMetrics` function to save the response data. + +The function saves new data that is returned each day, without overwriting old data in the file. + +New steps are annotated in **bold**. + +```javascript annotate +// Import Octokit +import { Octokit } from "octokit"; + +// **Import modules for working with files** +import path from 'path'; +import fs from 'fs'; +import { fileURLToPath } from 'url'; +import { dirname } from 'path'; + +// **Declare variables for working with files** +const __filename = fileURLToPath(import.meta.url); +const __dirname = dirname(__filename); + +// Set your token and organization +const octokit = new Octokit({ + auth: 'YOUR_TOKEN' +}); + +const org = 'YOUR_ORG'; + +// Call the API +async function orgMetrics() { + const resp = await octokit.request(`GET /orgs/${org}/copilot/metrics`, { + org: 'ORG', + headers: { + 'X-GitHub-Api-Version': '2022-11-28' + } + }); + + const copilotUsage = resp.data; + + // **Define the path to the local file where data will be stored** + const dataFilePath = path.join(__dirname, 'copilotMetricsData.json'); + + // **Read existing data from the file, if it exists** + let existingData = []; + if (fs.existsSync(dataFilePath)) { + const fileContent = fs.readFileSync(dataFilePath, 'utf8'); + existingData = JSON.parse(fileContent); + } + + // **Filter out the new data that is not already in the existing data** + const newData = copilotUsage.filter(entry => !existingData.some(existingEntry => existingEntry.date === entry.date)); + + // **Append new data to the existing data** + if (newData.length > 0) { + existingData = existingData.concat(newData); + + // **Save the updated data back to the file** + fs.writeFileSync(dataFilePath, JSON.stringify(existingData, null, 2)); + console.log(`Saved ${newData.length} new entries.`); + } else { + console.log('No new data to save.'); + } +} + +// Call the function +orgMetrics(); +``` + +### Run the script locally + +After running the script with `node copilot.mjs`, you should have a new file in your directory called `copilotMetricsData.json`. The file should contain data from the API response. + +If you run the script again tomorrow, it should only save data for one new day to the file. + +## 4. Analyze trends + +You can work with the data from the API to identify trends over the last 28 days or, if you've stored data from previous API calls, over a longer period. + +### Example + +In the following example, we update the `orgMetrics` function to extract the total and average number of active and engaged users per week. We could then use that data to track changes over time. This example uses the data returned directly from the API, and doesn't require stored data. + +New steps are annotated in **bold**. 
```javascript annotate
+// Call the API
+async function orgMetrics() {
+  const resp = await octokit.request(`GET /orgs/${org}/copilot/metrics`, {
+    org: 'ORG',
+    headers: {
+      'X-GitHub-Api-Version': '2022-11-28'
+    }
+  });
+
+  const copilotUsage = resp.data;
+
+  // **Create an object to store data for each week**
+  let userTrends = {
+    week1: {
+      days: 0,
+      activeUsers: 0,
+      engagedUsers: 0,
+    },
+    week2: {
+      days: 0,
+      activeUsers: 0,
+      engagedUsers: 0,
+    },
+    week3: {
+      days: 0,
+      activeUsers: 0,
+      engagedUsers: 0,
+    },
+    week4: {
+      days: 0,
+      activeUsers: 0,
+      engagedUsers: 0,
+    },
+  };
+
+  // **Iterate over the data**
+  for (let i = 0; i
[!TIP]
-> If you get an error that "this exceeds {% data variables.large_files.product_name_short %}'s file size limit of {% data variables.large_files.max_github_size %}" when you try to push files to Git, you can use `git lfs migrate` instead of `filter-repo` or the BFG Repo Cleaner, to move the large file to {% data variables.large_files.product_name_long %}. For more information about the `git lfs migrate` command, see the [Git LFS 2.2.0](https://github.com/blog/2384-git-lfs-2-2-0-released) release announcement.
+> If you get an error that "this exceeds {% data variables.large_files.product_name_short %}'s file size limit of {% data variables.large_files.max_github_size %}" when you try to push files to Git, you can use `git lfs migrate` instead of `filter-repo` to move the large file to {% data variables.large_files.product_name_long %}. For more information about the `git lfs migrate` command, see the [Git LFS 2.2.0](https://github.com/blog/2384-git-lfs-2-2-0-released) release announcement.

-1. Remove the file from the repository's Git history using either the `filter-repo` command or BFG Repo-Cleaner. For detailed information on using these, see "[AUTOTITLE](/authentication/keeping-your-account-and-data-secure/removing-sensitive-data-from-a-repository)."
+1. Remove the file from the repository's Git history using the `filter-repo` command. For detailed information on using this command, see "[AUTOTITLE](/authentication/keeping-your-account-and-data-secure/removing-sensitive-data-from-a-repository)."
1. Configure tracking for your file and push it to {% data variables.large_files.product_name_short %}. For more information on this procedure, see "[AUTOTITLE](/repositories/working-with-files/managing-large-files/configuring-git-large-file-storage)."

## Further reading
diff --git a/content/repositories/working-with-files/managing-large-files/removing-files-from-git-large-file-storage.md b/content/repositories/working-with-files/managing-large-files/removing-files-from-git-large-file-storage.md
index 210a79337500..f8b13176f37f 100644
--- a/content/repositories/working-with-files/managing-large-files/removing-files-from-git-large-file-storage.md
+++ b/content/repositories/working-with-files/managing-large-files/removing-files-from-git-large-file-storage.md
@@ -13,7 +13,7 @@ shortTitle: Remove files
---

## Removing a single file

-1. Remove the file from the repository's Git history using either the `filter-repo` command or BFG Repo-Cleaner. For detailed information on using these, see "[AUTOTITLE](/authentication/keeping-your-account-and-data-secure/removing-sensitive-data-from-a-repository)."
+1. Remove the file from the repository's Git history using the `filter-repo` command. For detailed information on using this command, see "[AUTOTITLE](/authentication/keeping-your-account-and-data-secure/removing-sensitive-data-from-a-repository)."
1. Navigate to your _.gitattributes_ file.

   > [!NOTE]
@@ -24,7 +24,7 @@ shortTitle: Remove files

## Removing all files within a {% data variables.large_files.product_name_short %} repository

-1. Remove the files from the repository's Git history using either the `filter-repo` command or BFG Repo-Cleaner. For detailed information on using these, see "[AUTOTITLE](/authentication/keeping-your-account-and-data-secure/removing-sensitive-data-from-a-repository)."
+1. Remove the files from the repository's Git history using the `filter-repo` command. For detailed information on using this command, see "[AUTOTITLE](/authentication/keeping-your-account-and-data-secure/removing-sensitive-data-from-a-repository)."
1. Optionally, to uninstall {% data variables.large_files.product_name_short %} in the repository, run:

   ```shell
diff --git a/content/rest/copilot/copilot-metrics.md b/content/rest/copilot/copilot-metrics.md
index 55282cd999c9..ed8a35e96162 100644
--- a/content/rest/copilot/copilot-metrics.md
+++ b/content/rest/copilot/copilot-metrics.md
@@ -11,4 +11,15 @@ autogenerated: rest
allowTitleToDifferFromFilename: true
---

+You can use these endpoints to get a breakdown of aggregated metrics for various {% data variables.product.prodname_copilot %} features. The API includes:
+
+* Data for the last 28 days
+* Numbers of active users and engaged users
+* Breakdowns by language and IDE
+* The option to view metrics for an enterprise, organization, or team
+
+If you currently use the [AUTOTITLE](/rest/copilot/copilot-usage), we recommend migrating to these endpoints as soon as possible.
+
+For help getting started, see "[AUTOTITLE](/copilot/managing-copilot/managing-github-copilot-in-your-organization/reviewing-activity-related-to-github-copilot-in-your-organization/analyzing-usage-over-time-with-the-copilot-metrics-api)."
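+
+For example, a quick way to try the organization endpoint is with {% data variables.product.prodname_cli %}. This is a sketch: it assumes `gh` is authenticated as a user with access to the organization's {% data variables.product.prodname_copilot_short %} metrics, and YOUR-ORG is a placeholder:
+
+```shell
+# List the dates covered by the metrics returned for your organization
+gh api /orgs/YOUR-ORG/copilot/metrics --jq '.[].date'
+```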