Initial commit

pull/4/head
Josh Gross 5 years ago
parent 551cf17d91
commit 37c45447e4

@@ -0,0 +1,20 @@
name: Test Cache Action
on:
  pull_request:
  push:
    branches:
      - master
jobs:
  test:
    runs-on: ubuntu-latest
    steps:
    - uses: actions/checkout@v1
    - run: npm ci
    - name: Prettier Format Check
      run: npm run format-check
    - name: Build & Test
      run: npm run test

.gitignore (vendored): 96 additions

@@ -0,0 +1,96 @@
__tests__/runner/*
# comment out in distribution branches
dist/
node_modules/
lib/
# Rest pulled from https://github.com/github/gitignore/blob/master/Node.gitignore
# Logs
logs
*.log
npm-debug.log*
yarn-debug.log*
yarn-error.log*
lerna-debug.log*
# Diagnostic reports (https://nodejs.org/api/report.html)
report.[0-9]*.[0-9]*.[0-9]*.[0-9]*.json
# Runtime data
pids
*.pid
*.seed
*.pid.lock
# Directory for instrumented libs generated by jscoverage/JSCover
lib-cov
# Coverage directory used by tools like istanbul
coverage
*.lcov
# nyc test coverage
.nyc_output
# Grunt intermediate storage (https://gruntjs.com/creating-plugins#storing-task-files)
.grunt
# Bower dependency directory (https://bower.io/)
bower_components
# node-waf configuration
.lock-wscript
# Compiled binary addons (https://nodejs.org/api/addons.html)
build/Release
# Dependency directories
jspm_packages/
# TypeScript v1 declaration files
typings/
# TypeScript cache
*.tsbuildinfo
# Optional npm cache directory
.npm
# Optional eslint cache
.eslintcache
# Optional REPL history
.node_repl_history
# Output of 'npm pack'
*.tgz
# Yarn Integrity file
.yarn-integrity
# dotenv environment variables file
.env
.env.test
# parcel-bundler cache (https://parceljs.org/)
.cache
# next.js build output
.next
# nuxt.js build output
.nuxt
# vuepress build output
.vuepress/dist
# Serverless directories
.serverless/
# FuseBox cache
.fusebox/
# DynamoDB Local files
.dynamodb/

@@ -0,0 +1,11 @@
{
"printWidth": 80,
"tabWidth": 4,
"useTabs": false,
"semi": true,
"singleQuote": false,
"trailingComma": "none",
"bracketSpacing": true,
"arrowParens": "avoid",
"parser": "typescript"
}

@@ -0,0 +1,17 @@
{
// Use IntelliSense to learn about possible attributes.
// Hover to view descriptions of existing attributes.
// For more information, visit: https://go.microsoft.com/fwlink/?linkid=830387
"version": "0.2.0",
"configurations": [
{
"type": "node",
"request": "launch",
"name": "Jest Test",
"program": "${workspaceFolder}/node_modules/jest/bin/jest",
"args": ["--runInBand", "--config=${workspaceFolder}/jest.config.js"],
"console": "integratedTerminal",
"internalConsoleOptions": "neverOpen"
}
]
}

@@ -0,0 +1,34 @@
## Contributing
[fork]: https://github.com/actions/cache/fork
[pr]: https://github.com/actions/cache/compare
[style]: https://github.com/styleguide/js
[code-of-conduct]: CODE_OF_CONDUCT.md
Hi there! We're thrilled that you'd like to contribute to this project. Your help is essential for keeping it great.
Contributions to this project are [released](https://help.github.com/articles/github-terms-of-service/#6-contributions-under-repository-license) to the public under the [project's open source license](LICENSE).
Please note that this project is released with a [Contributor Code of Conduct][code-of-conduct]. By participating in this project you agree to abide by its terms.
## Submitting a pull request
1. [Fork][fork] and clone the repository
2. Configure and install the dependencies: `npm install`
3. Make sure the tests pass on your machine: `npm run test`
4. Create a new branch: `git checkout -b my-branch-name`
5. Make your change, add tests, and make sure the tests still pass
6. Push to your fork and [submit a pull request][pr]
7. Pat yourself on the back and wait for your pull request to be reviewed and merged.
Here are a few things you can do that will increase the likelihood of your pull request being accepted:
- Write tests.
- Keep your change as focused as possible. If there are multiple changes you would like to make that are not dependent upon each other, consider submitting them as separate pull requests.
- Write a [good commit message](http://tbaggery.com/2008/04/19/a-note-about-git-commit-messages.html).
## Resources
- [How to Contribute to Open Source](https://opensource.guide/how-to-contribute/)
- [Using Pull Requests](https://help.github.com/articles/about-pull-requests/)
- [GitHub Help](https://help.github.com)

@@ -0,0 +1,22 @@
The MIT License (MIT)
Copyright (c) 2018 GitHub, Inc. and contributors
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in
all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
THE SOFTWARE.

@@ -1 +1,133 @@
# cache
This GitHub Action allows caching dependencies and build outputs to improve workflow execution time.
## Usage
### Pre-requisites
Create a workflow `.yml` file in your repository's `.github/workflows` directory. An [example workflow](#example-workflow) is available below. For more information, reference the GitHub Help Documentation for [Creating a workflow file](https://help.github.com/en/articles/configuring-a-workflow#creating-a-workflow-file).
### Inputs
* `path` - A directory to store and save the cache
* `key` - An explicit key for restoring and saving the cache
* `restore-keys` - An ordered list of keys to use for restoring the cache if no cache hit occurred for key
### Outputs
* `cache-hit` - A boolean value to indicate an exact match was found for the key
> See [Skipping steps based on cache-hit](#skipping-steps-based-on-cache-hit) for info on using this output
### Example workflow
```yaml
name: Example Caching with npm
on: push
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
    - uses: actions/checkout@v1
    - name: Cache node_modules
      uses: actions/cache@preview
      with:
        path: node_modules
        key: ${{ runner.os }}-node
    - name: Install Dependencies
      run: npm install
    - name: Build
      run: npm run build
    - name: Test
      run: npm run test
```
## Ecosystem Examples
### Node - npm
```yaml
- uses: actions/cache@preview
  with:
    path: node_modules
    key: ${{ runner.os }}-node
```
### Node - Yarn
```yaml
- uses: actions/cache@preview
  with:
    path: ~/.cache/yarn
    key: ${{ runner.os }}-yarn-${{ hashFiles(format('{0}{1}', github.workspace, '/yarn.lock')) }}
    restore-keys: |
      ${{ runner.os }}-yarn-
```
### C# - NuGet
```yaml
- uses: actions/cache@preview
  with:
    path: ~/.nuget/packages
    key: ${{ runner.os }}-nuget-${{ hashFiles('**/packages.lock.json') }}
    restore-keys: |
      ${{ runner.os }}-nuget-
```
### Java - Gradle
```yaml
- uses: actions/cache@preview
  with:
    path: ~/.gradle/caches
    key: gradle-${{ runner.os }}-${{ hashFiles('**/*.gradle') }}
    restore-keys: |
      gradle-${{ runner.os }}-
```
### Java - Maven
```yaml
- uses: actions/cache@preview
  with:
    path: ~/.m2/repository
    key: ${{ runner.os }}-maven
```
## Cache Limits
Individual caches are limited to 200MB, and a repository can have up to 2GB of caches. Once the 2GB limit is reached, older caches will be evicted based on when the cache was last accessed.
## Skipping steps based on cache-hit
Using the `cache-hit` output, subsequent steps (such as install or build) can be skipped when a cache hit occurs on the key.
Example:
```yaml
steps:
- uses: actions/checkout@v1
- uses: actions/cache@preview
  id: cache
  with:
    path: path/to/dependencies
    key: ${{ runner.os }}-${{ hashFiles('**/lockfiles') }}
- name: Install Dependencies
  if: steps.cache.outputs.cache-hit != 'true'
  run: /install.sh
```
> Note: The `id` defined in `actions/cache` must match the `id` in the `if` statement (i.e. `steps.[ID].outputs.cache-hit`)
## Contributing
We would love for you to contribute to `@actions/cache`, pull requests are welcome! Please see the [CONTRIBUTING.md](CONTRIBUTING.md) for more information.
## License
The scripts and documentation in this project are released under the [MIT License](LICENSE)

@@ -0,0 +1,22 @@
import * as core from "@actions/core";
import { Inputs } from "../src/constants";
import run from "../src/restore";
import * as testUtils from "../src/utils/testUtils";
test("restore with no path", async () => {
const failedMock = jest.spyOn(core, "setFailed");
await run();
expect(failedMock).toHaveBeenCalledWith(
"Input required and not supplied: path"
);
});
test("restore with no key", async () => {
testUtils.setInput(Inputs.Path, "node_modules");
const failedMock = jest.spyOn(core, "setFailed");
await run();
expect(failedMock).toHaveBeenCalledWith(
"Input required and not supplied: key"
);
});

@@ -0,0 +1,24 @@
name: 'Cache'
description: 'Cache dependencies and build outputs to improve workflow execution time'
author: 'GitHub'
inputs:
  path:
    description: 'A directory to store and save the cache'
    required: true
  key:
    description: 'An explicit key for restoring and saving the cache'
    required: true
  restore-keys:
    description: 'An ordered list of keys to use for restoring the cache if no cache hit occurred for key'
    required: false
outputs:
  cache-hit:
    description: 'A boolean value to indicate an exact match was found for the primary key'
runs:
  using: 'node12'
  main: 'dist/restore/index.js'
  post: 'dist/save/index.js'
  post-if: 'success()'
branding:
  icon: 'archive'
  color: 'gray-dark'

@@ -0,0 +1,11 @@
module.exports = {
clearMocks: true,
moduleFileExtensions: ['js', 'ts'],
testEnvironment: 'node',
testMatch: ['**/*.test.ts'],
testRunner: 'jest-circus/runner',
transform: {
'^.+\\.ts$': 'ts-jest'
},
verbose: true
}

package-lock.json (generated): 5,315 additions

File diff suppressed because it is too large.

@@ -0,0 +1,43 @@
{
"name": "cache",
"version": "0.0.1",
"private": true,
"description": "Cache dependencies and build outputs",
"main": "dist/restore/index.js",
"scripts": {
"build": "tsc",
"test": "tsc --noEmit && jest --coverage",
"format": "prettier --write **/*.ts",
"format-check": "prettier --check **/*.ts",
"release": "ncc build -o dist/restore src/restore.ts && ncc build -o dist/save src/save.ts && git add -f dist/"
},
"repository": {
"type": "git",
"url": "git+https://github.com/actions/cache.git"
},
"keywords": [
"actions",
"node",
"cache"
],
"author": "GitHub",
"license": "MIT",
"dependencies": {
"@actions/core": "^1.2.0",
"@actions/exec": "^1.0.1",
"@actions/io": "^1.0.1",
"typed-rest-client": "^1.5.0",
"uuid": "^3.3.3"
},
"devDependencies": {
"@types/jest": "^24.0.13",
"@types/node": "^12.0.4",
"@types/uuid": "^3.4.5",
"@zeit/ncc": "^0.20.5",
"jest": "^24.8.0",
"jest-circus": "^24.7.1",
"prettier": "1.18.2",
"ts-jest": "^24.0.2",
"typescript": "^3.5.1"
}
}

@@ -0,0 +1,126 @@
import * as core from "@actions/core";
import * as fs from "fs";
import { BearerCredentialHandler } from "typed-rest-client/Handlers";
import { HttpClient } from "typed-rest-client/HttpClient";
import { IHttpClientResponse } from "typed-rest-client/Interfaces";
import { RestClient, IRequestOptions } from "typed-rest-client/RestClient";
import { ArtifactCacheEntry } from "./contracts";
export async function getCacheEntry(
keys: string[]
): Promise<ArtifactCacheEntry> {
const cacheUrl = getCacheUrl();
const token = process.env["ACTIONS_RUNTIME_TOKEN"] || "";
const bearerCredentialHandler = new BearerCredentialHandler(token);
const resource = `_apis/artifactcache/cache?keys=${encodeURIComponent(
keys.join(",")
)}`;
const restClient = new RestClient("actions/cache", cacheUrl, [
bearerCredentialHandler
]);
const response = await restClient.get<ArtifactCacheEntry>(
resource,
getRequestOptions()
);
if (response.statusCode === 204) {
throw new Error(
`Cache not found for input keys: ${JSON.stringify(keys)}.`
);
}
if (response.statusCode !== 200) {
throw new Error(`Cache service responded with ${response.statusCode}`);
}
const cacheResult = response.result;
core.debug(`Cache Result:`);
core.debug(JSON.stringify(cacheResult));
if (!cacheResult || !cacheResult.archiveLocation) {
throw new Error("Cache not found.");
}
return cacheResult;
}
export async function downloadCache(
cacheEntry: ArtifactCacheEntry,
archivePath: string
): Promise<void> {
const stream = fs.createWriteStream(archivePath);
const httpClient = new HttpClient("actions/cache");
const downloadResponse = await httpClient.get(cacheEntry.archiveLocation!);
await pipeResponseToStream(downloadResponse, stream);
}
async function pipeResponseToStream(
response: IHttpClientResponse,
stream: NodeJS.WritableStream
): Promise<void> {
return new Promise(resolve => {
response.message.pipe(stream).on("close", () => {
resolve();
});
});
}
export async function saveCache(stream: NodeJS.ReadableStream, key: string) {
const cacheUrl = getCacheUrl();
const token = process.env["ACTIONS_RUNTIME_TOKEN"] || "";
const bearerCredentialHandler = new BearerCredentialHandler(token);
const resource = `_apis/artifactcache/cache/${encodeURIComponent(key)}`;
const postUrl = cacheUrl + resource;
const restClient = new RestClient("actions/cache", undefined, [
bearerCredentialHandler
]);
const requestOptions = getRequestOptions();
requestOptions.additionalHeaders = {
"Content-Type": "application/octet-stream"
};
const response = await restClient.uploadStream<void>(
"POST",
postUrl,
stream,
requestOptions
);
if (response.statusCode !== 200) {
throw new Error(`Cache service responded with ${response.statusCode}`);
}
core.info("Cache saved successfully");
}
function getRequestOptions(): IRequestOptions {
const requestOptions: IRequestOptions = {
acceptHeader: createAcceptHeader("application/json", "5.2-preview.1")
};
return requestOptions;
}
function createAcceptHeader(type: string, apiVersion: string): string {
return `${type};api-version=${apiVersion}`;
}
function getCacheUrl(): string {
// Ideally we just use ACTIONS_CACHE_URL
let cacheUrl: string = (
process.env["ACTIONS_CACHE_URL"] ||
process.env["ACTIONS_RUNTIME_URL"] ||
""
).replace("pipelines", "artifactcache");
if (!cacheUrl) {
throw new Error(
"Cache Service Url not found, unable to restore cache."
);
}
core.debug(`Cache Url: ${cacheUrl}`);
return cacheUrl;
}
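As a rough sketch of the URL rewrite performed above (the runner URL below is invented for illustration; real values come from `ACTIONS_CACHE_URL` or `ACTIONS_RUNTIME_URL`):

```ts
// Illustration only: mirrors the string replacement done in getCacheUrl().
// The runtime URL here is a made-up example, not a real endpoint.
const runtimeUrl = "https://pipelines.actions.example.com/SomeRunToken/";
const derivedCacheUrl = runtimeUrl.replace("pipelines", "artifactcache");
console.log(derivedCacheUrl); // https://artifactcache.actions.example.com/SomeRunToken/
```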

@@ -0,0 +1,14 @@
export namespace Inputs {
export const Key = "key";
export const Path = "path";
export const RestoreKeys = "restore-keys";
}
export namespace Outputs {
export const CacheHit = "cache-hit";
}
export namespace State {
export const CacheKey = "CACHE_KEY";
export const CacheResult = "CACHE_RESULT";
}

@@ -0,0 +1,6 @@
export interface ArtifactCacheEntry {
cacheKey?: string;
scope?: string;
creationTime?: string;
archiveLocation?: string;
}
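For context, a populated entry might look like the sketch below; every field value is invented for illustration and does not come from the cache service.

```ts
import { ArtifactCacheEntry } from "./contracts";

// Hypothetical cache entry; values are placeholders only.
const exampleEntry: ArtifactCacheEntry = {
    cacheKey: "Linux-node-5d41402a",
    scope: "refs/heads/master",
    creationTime: "2019-10-30T12:00:00.000Z",
    archiveLocation: "https://example.com/cache/archives/1234"
};
```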

@@ -0,0 +1,115 @@
import * as core from "@actions/core";
import { exec } from "@actions/exec";
import * as io from "@actions/io";
import * as fs from "fs";
import * as path from "path";
import * as cacheHttpClient from "./cacheHttpClient";
import { Inputs, State } from "./constants";
import * as utils from "./utils/actionUtils";
async function run() {
try {
// Validate inputs; this can cause task failure
let cachePath = utils.resolvePath(
core.getInput(Inputs.Path, { required: true })
);
core.debug(`Cache Path: ${cachePath}`);
const primaryKey = core.getInput(Inputs.Key, { required: true });
core.saveState(State.CacheKey, primaryKey);
const restoreKeys = core.getInput(Inputs.RestoreKeys).split("\n");
const keys = [primaryKey, ...restoreKeys];
core.debug("Resolved Keys:");
core.debug(JSON.stringify(keys));
if (keys.length > 10) {
core.setFailed(
`Key Validation Error: Keys are limited to a maximum of 10.`
);
return;
}
for (const key of keys) {
if (key.length > 512) {
core.setFailed(
`Key Validation Error: ${key} cannot be larger than 512 characters.`
);
return;
}
const regex = /^[^,]*$/;
if (!regex.test(key)) {
core.setFailed(
`Key Validation Error: ${key} cannot contain commas.`
);
return;
}
}
try {
let archivePath = path.join(
await utils.createTempDirectory(),
"cache.tgz"
);
core.debug(`Archive Path: ${archivePath}`);
const cacheEntry = await cacheHttpClient.getCacheEntry(keys);
// Store the cache result
utils.setCacheState(cacheEntry);
// Download the cache from the cache entry
await cacheHttpClient.downloadCache(cacheEntry, archivePath);
await io.mkdirP(cachePath);
// http://man7.org/linux/man-pages/man1/tar.1.html
// tar [-options] <name of the tar archive> [files or directories to add to the archive]
const args = ["-xz"];
const IS_WINDOWS = process.platform === "win32";
if (IS_WINDOWS) {
args.push("--force-local");
archivePath = archivePath.replace(/\\/g, "/");
cachePath = cachePath.replace(/\\/g, "/");
}
args.push(...["-f", archivePath, "-C", cachePath]);
const tarPath = await io.which("tar", true);
core.debug(`Tar Path: ${tarPath}`);
const archiveFileSize = fs.statSync(archivePath).size;
core.debug(`File Size: ${archiveFileSize}`);
await exec(`"${tarPath}"`, args);
const isExactKeyMatch = utils.isExactKeyMatch(
primaryKey,
cacheEntry
);
utils.setCacheHitOutput(isExactKeyMatch);
core.info(
`Cache restored from key: ${cacheEntry && cacheEntry.cacheKey}`
);
try {
core.info("Cache Checksum:");
await exec(`md5sum`, [`${archivePath}`]);
} catch (error) {
core.debug(`Failed to checksum with ${error}`);
}
} catch (error) {
core.warning(error.message);
utils.setCacheHitOutput(false);
}
} catch (error) {
core.setFailed(error.message);
}
}
run();
export default run;

@@ -0,0 +1,83 @@
import * as core from "@actions/core";
import { exec } from "@actions/exec";
import * as io from "@actions/io";
import * as fs from "fs";
import * as path from "path";
import * as cacheHttpClient from "./cacheHttpClient";
import { Inputs, State } from "./constants";
import * as utils from "./utils/actionUtils";
async function run() {
try {
const state = utils.getCacheState();
// Inputs are re-evaluated before the post action, so we want the original key used for restore
const primaryKey = core.getState(State.CacheKey);
if (!primaryKey) {
core.warning(`Error retrieving key from state.`);
return;
}
if (utils.isExactKeyMatch(primaryKey, state)) {
core.info(
`Cache hit occurred on the primary key ${primaryKey}, not saving cache.`
);
return;
}
let cachePath = utils.resolvePath(
core.getInput(Inputs.Path, { required: true })
);
core.debug(`Cache Path: ${cachePath}`);
let archivePath = path.join(
await utils.createTempDirectory(),
"cache.tgz"
);
core.debug(`Archive Path: ${archivePath}`);
// http://man7.org/linux/man-pages/man1/tar.1.html
// tar [-options] <name of the tar archive> [files or directories to add to the archive]
const args = ["-cz"];
const IS_WINDOWS = process.platform === "win32";
if (IS_WINDOWS) {
args.push("--force-local");
archivePath = archivePath.replace(/\\/g, "/");
cachePath = cachePath.replace(/\\/g, "/");
}
args.push(...["-f", archivePath, "-C", cachePath, "."]);
const tarPath = await io.which("tar", true);
core.debug(`Tar Path: ${tarPath}`);
await exec(`"${tarPath}"`, args);
const fileSizeLimit = 200 * 1024 * 1024; // 200MB
const archiveFileSize = fs.statSync(archivePath).size;
core.debug(`File Size: ${archiveFileSize}`);
if (archiveFileSize > fileSizeLimit) {
core.warning(
`Cache size of ${archiveFileSize} bytes is over the 200MB limit, not saving cache.`
);
return;
}
const stream = fs.createReadStream(archivePath);
await cacheHttpClient.saveCache(stream, primaryKey);
try {
core.info("Cache Checksum:");
await exec(`md5sum`, [`${archivePath}`]);
} catch (error) {
core.debug(`Failed to checksum with ${error}`);
}
} catch (error) {
core.warning(error.message);
}
}
run();
export default run;

@@ -0,0 +1,81 @@
import * as core from "@actions/core";
import * as io from "@actions/io";
import * as os from "os";
import * as path from "path";
import * as uuidV4 from "uuid/v4";
import { Outputs, State } from "../constants";
import { ArtifactCacheEntry } from "../contracts";
// From https://github.com/actions/toolkit/blob/master/packages/tool-cache/src/tool-cache.ts#L23
export async function createTempDirectory(): Promise<string> {
const IS_WINDOWS = process.platform === "win32";
let tempDirectory: string = process.env["RUNNER_TEMP"] || "";
if (!tempDirectory) {
let baseLocation: string;
if (IS_WINDOWS) {
// On Windows use the USERPROFILE env variable
baseLocation = process.env["USERPROFILE"] || "C:\\";
} else {
if (process.platform === "darwin") {
baseLocation = "/Users";
} else {
baseLocation = "/home";
}
}
tempDirectory = path.join(baseLocation, "actions", "temp");
}
const dest = path.join(tempDirectory, uuidV4.default());
await io.mkdirP(dest);
return dest;
}
export function isExactKeyMatch(
key: string,
cacheResult?: ArtifactCacheEntry
): boolean {
return !!(
cacheResult &&
cacheResult.cacheKey &&
cacheResult.cacheKey.localeCompare(key, undefined, {
sensitivity: "accent"
}) === 0
);
}
export function setOutputAndState(
key: string,
cacheResult?: ArtifactCacheEntry
) {
setCacheHitOutput(isExactKeyMatch(key, cacheResult));
// Store the cache result if it exists
cacheResult && setCacheState(cacheResult);
}
export function getCacheState(): ArtifactCacheEntry | undefined {
const stateData = core.getState(State.CacheResult);
core.debug(`State: ${stateData}`);
return (stateData && JSON.parse(stateData)) as ArtifactCacheEntry;
}
export function setCacheState(state: ArtifactCacheEntry) {
core.saveState(State.CacheResult, JSON.stringify(state));
}
export function setCacheHitOutput(isCacheHit: boolean) {
core.setOutput(Outputs.CacheHit, isCacheHit.toString());
}
export function resolvePath(filePath: string): string {
if (filePath[0] === "~") {
const home = os.homedir();
if (!home) {
throw new Error("Unable to resole `~` to HOME");
}
return path.join(home, filePath.slice(1));
}
return path.resolve(filePath);
}
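A small usage sketch of the helpers above (inputs are made up; the resolved path assumes a Linux runner whose home directory is `/home/runner`):

```ts
import { isExactKeyMatch, resolvePath } from "./utils/actionUtils";

// "~" is expanded via os.homedir(); on a Linux runner this resolves to
// something like /home/runner/.npm.
const cachePath = resolvePath("~/.npm");

// localeCompare with sensitivity "accent" ignores case, so a cache key that
// differs only in casing still counts as an exact match.
const exact = isExactKeyMatch("linux-node-modules", {
    cacheKey: "Linux-Node-Modules"
}); // true
```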

@@ -0,0 +1,7 @@
function getInputName(name: string): string {
return `INPUT_${name.replace(/ /g, "_").toUpperCase()}`;
}
export function setInput(name: string, value: string) {
process.env[getInputName(name)] = value;
}
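This mirrors how `@actions/core` reads inputs from `INPUT_`-prefixed environment variables, so a test can drive `core.getInput` like the minimal sketch below (written as if it lived next to `restore.test.ts`):

```ts
import * as core from "@actions/core";
import * as testUtils from "../src/utils/testUtils";

// Sets process.env["INPUT_RESTORE-KEYS"], which core.getInput("restore-keys")
// then reads back, matching how the runner passes inputs to the action.
testUtils.setInput("restore-keys", "node-\nnode-modules-");
core.info(core.getInput("restore-keys")); // "node-\nnode-modules-"
```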

@@ -0,0 +1,63 @@
{
"compilerOptions": {
/* Basic Options */
// "incremental": true, /* Enable incremental compilation */
"target": "es6", /* Specify ECMAScript target version: 'ES3' (default), 'ES5', 'ES2015', 'ES2016', 'ES2017', 'ES2018', 'ES2019' or 'ESNEXT'. */
"module": "commonjs", /* Specify module code generation: 'none', 'commonjs', 'amd', 'system', 'umd', 'es2015', or 'ESNext'. */
// "allowJs": true, /* Allow javascript files to be compiled. */
// "checkJs": true, /* Report errors in .js files. */
// "jsx": "preserve", /* Specify JSX code generation: 'preserve', 'react-native', or 'react'. */
// "declaration": true, /* Generates corresponding '.d.ts' file. */
// "declarationMap": true, /* Generates a sourcemap for each corresponding '.d.ts' file. */
// "sourceMap": true, /* Generates corresponding '.map' file. */
// "outFile": "./", /* Concatenate and emit output to single file. */
"outDir": "./lib", /* Redirect output structure to the directory. */
"rootDir": "./src", /* Specify the root directory of input files. Use to control the output directory structure with --outDir. */
// "composite": true, /* Enable project compilation */
// "tsBuildInfoFile": "./", /* Specify file to store incremental compilation information */
// "removeComments": true, /* Do not emit comments to output. */
// "noEmit": true, /* Do not emit outputs. */
// "importHelpers": true, /* Import emit helpers from 'tslib'. */
// "downlevelIteration": true, /* Provide full support for iterables in 'for-of', spread, and destructuring when targeting 'ES5' or 'ES3'. */
// "isolatedModules": true, /* Transpile each file as a separate module (similar to 'ts.transpileModule'). */
/* Strict Type-Checking Options */
"strict": true, /* Enable all strict type-checking options. */
"noImplicitAny": false, /* Raise error on expressions and declarations with an implied 'any' type. */
// "strictNullChecks": true, /* Enable strict null checks. */
// "strictFunctionTypes": true, /* Enable strict checking of function types. */
// "strictBindCallApply": true, /* Enable strict 'bind', 'call', and 'apply' methods on functions. */
// "strictPropertyInitialization": true, /* Enable strict checking of property initialization in classes. */
// "noImplicitThis": true, /* Raise error on 'this' expressions with an implied 'any' type. */
// "alwaysStrict": true, /* Parse in strict mode and emit "use strict" for each source file. */
/* Additional Checks */
// "noUnusedLocals": true, /* Report errors on unused locals. */
// "noUnusedParameters": true, /* Report errors on unused parameters. */
// "noImplicitReturns": true, /* Report error when not all code paths in function return a value. */
// "noFallthroughCasesInSwitch": true, /* Report errors for fallthrough cases in switch statement. */
/* Module Resolution Options */
// "moduleResolution": "node", /* Specify module resolution strategy: 'node' (Node.js) or 'classic' (TypeScript pre-1.6). */
// "baseUrl": "./", /* Base directory to resolve non-absolute module names. */
// "paths": {}, /* A series of entries which re-map imports to lookup locations relative to the 'baseUrl'. */
// "rootDirs": [], /* List of root folders whose combined content represents the structure of the project at runtime. */
// "typeRoots": [], /* List of folders to include type definitions from. */
// "types": [], /* Type declaration files to be included in compilation. */
// "allowSyntheticDefaultImports": true, /* Allow default imports from modules with no default export. This does not affect code emit, just typechecking. */
"esModuleInterop": true /* Enables emit interoperability between CommonJS and ES Modules via creation of namespace objects for all imports. Implies 'allowSyntheticDefaultImports'. */
// "preserveSymlinks": true, /* Do not resolve the real path of symlinks. */
// "allowUmdGlobalAccess": true, /* Allow accessing UMD globals from modules. */
/* Source Map Options */
// "sourceRoot": "", /* Specify the location where debugger should locate TypeScript files instead of source locations. */
// "mapRoot": "", /* Specify the location where debugger should locate map files instead of generated locations. */
// "inlineSourceMap": true, /* Emit a single file with source maps instead of having a separate file. */
// "inlineSources": true, /* Emit the source alongside the sourcemaps within a single file; requires '--inlineSourceMap' or '--sourceMap' to be set. */
/* Experimental Options */
// "experimentalDecorators": true, /* Enables experimental support for ES7 decorators. */
// "emitDecoratorMetadata": true, /* Enables experimental support for emitting type metadata for decorators. */
},
"exclude": ["node_modules", "**/*.test.ts"]
}