
Today, we are announcing two new ways to protect your npm account. Please read on to learn how you can use these security features to keep your code safe and increase everyone’s trust in the more than 550,000 packages of code in the npm Registry.
Two-factor authentication (2FA)
Now, you can sync your npm account with an authentication application like Google Authenticator or Authy. When you log in, you’ll be prompted for a single-use numeric code generated by the app.
2FA is another layer of defense for your account, preventing third parties from altering your code even if they steal or guess your credentials. This is one of the easiest and most important ways to ensure that only you can access your npm account.
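For the command-line inclined: recent versions of the npm client (see the update at the end of this post) include an npm profile command that can switch this on. A minimal sketch, assuming your client is new enough to have it:
# enable 2FA for both logins and writes such as publishes
npm profile enable-2fa auth-and-writes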
Read-only tokens
If your continuous integration / continuous deployment (CI/CD) workflow includes linking your npm account to tools like Travis CI with authentication tokens, you can now create read-only tokens for tools that don’t need to publish. You can also restrict tokens to work from only specified ranges of IP addresses.
Even if your token is compromised — for example, if you accidentally commit it to GitHub — no one else can alter your code, and only authorized CI servers will be able to download your code.
Set these up now (please)
The npm community is now larger than the population of New York City, so it’s never been more important to reinforce trust and encourage collaboration. Every developer who secures their npm account with these new methods helps ensure the safety and integrity of the code we all discover, share, and reuse to build amazing things.
Learn how to activate 2FA in this doc:
Using Two-Factor Authentication
Watch this space
There has never been a major security incident caused by leaked npm credentials, but our security work is never finished. We work continuously to protect the npm Registry and detect and remove malicious code, and we try to keep you informed of our efforts.
If you ever believe you’ve encountered any malicious code on the Registry or in npm itself, contact us right away through the security contact form on the npm website or by emailing [email protected]. If you have any feedback or questions about what we’ve rolled out today, just contact [email protected].
Thanks for helping keep the npm community safe.
UPDATE: To try out 2FA, you’ll need version [email protected] or newer of the npm client. To get it, run `npm install npm@latest -g`.

Editor’s note: This is a guest post from Adam Baldwin of ^Lift Security and the Node Security Platform. As we discussed in earlier posts, Adam conducts constant security reviews of the Registry and its contents and keeps us apprised of anything that might compromise our security.
Over the years I’ve spent a lot of time digging through the half million public packages on npm. There are millions of public tarballs to go with those public packages, which means there’s a lot of code to look through and experiment with. Buried in that code is a surprising amount of sensitive information: authentication tokens, passwords, and production test data including credit card numbers.
You, as a developer publishing to npm, want to avoid leaking your data like this. I’ll share some tips for how to control what you’re publishing and keep your secrets out of the public registry.
Let’s explore the behavior of npm publish, because understanding how it chooses which files to include is critical to controlling what gets published. If you want to dive in deeper, the npm documentation goes into more detail, but I’ll cover the important points here.
When you run npm publish, npm bundles up all the files in the current directory. It makes a few decisions for you about what to include and what to ignore. To make these decisions, it uses the contents of several files in your project directory. These files include .gitignore, .npmignore, and the files array in the package.json. It also always includes certain files and ignores others.
npm will always include these files in a package:
- package.json
- README and its variants like README.md
- CHANGELOG and its variants
- LICENSE and the alternative spelling LICENCE

npm will always ignore these file name patterns:

- .*.swp
- ._*
- .DS_Store
- .git, .hg, .svn, and CVS version control directories
- .npmrc
- .lock-wscript
- .wafpickle-*
- config.gypi
- npm-debug.log

One of the most common ways to exclude files and folders is to specify them in a .gitignore file. This is because files you do not want to commit to your repository are also typically files you do not want to be published.
npm also honors a file called .npmignore, which behaves exactly like .gitignore. These files are not cumulative: adding an .npmignore file to your project replaces your .gitignore entirely for publishing purposes. If you try to use both, you may inadvertently publish a file you thought you had excluded.
This is how that might happen:
1. You add a production.json configuration file to your .gitignore file because it contains sensitive information.
2. You add an .npmignore file but forget to add any files to it.
3. You run npm publish.
4. Because .npmignore exists, it is consulted instead of .gitignore, but it includes no files to ignore!
5. Your production.json file is therefore published, and your sensitive information is leaked.

Stick to using .gitignore if you can! If you are using a different version control system, use .npmignore. If you are using git and have an ignore file but wish to publish some of the files you’re not committing (perhaps the result of build steps), start by copying your .gitignore file to .npmignore. Then edit it to remove the files you don’t want in git but do want in your package.
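If you do end up maintaining an .npmignore, the copy-then-edit approach described above starts with a single command:
# start from your existing git ignores, then hand-edit .npmignore to
# drop the entries for build output you do want to publish
cp .gitignore .npmignore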
There’s an even better way of controlling exactly which files are published with your package: whitelisting with the files array. Only 57,000 packages use this method of controlling what goes into them, probably because it requires you to take inventory of your package. It’s by far the safest way to do it, though.
The files array specifies each file or directory to include in your publish. Only those files are included, plus the ones npm always includes no matter what (such as package.json), minus the ones denied by another rule.
Here’s a package.json file with a files array:
{
"name": "@adam_baldwin/wombats",
"version": "1.0.0",
"description": "",
"main": "index.js",
"scripts": {
"test": "echo \"Error: no test specified\" && exit 1"
},
"files": [
"index.js"
],
"keywords": [],
"author": "Adam Baldwin <[email protected]> (https://liftsecurity.io)",
"license": "ISC"
}
No matter what other files exist in this project directory during npm publish, only the index.js file will be packed up into the tarball (plus the README and package.json, of course!).
You can use the npm-packlist module to programmatically get a list of the files npm would include for a specific directory. You can also run npm itself to find what it would include. This command lists the files that would get packed up:
tar tvf $(npm pack)
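Note that npm pack writes the tarball into the current directory and prints its filename (which is why the command above works); a slightly longer variant of the same check cleans up after itself:
# capture the tarball name npm pack prints, list its contents, then remove it
tarball=$(npm pack)
tar tvf "$tarball"
rm "$tarball"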
If you are using private modules in a continuous integration (CI) system, you will need to provide that service with an authentication token. In the past you had to provide a regular authentication token out of your .npmrc, which gave your CI system the ability to do everything you can do with your npm account. Now it’s possible to generate a read-only token that can limit the damage if a token is leaked via CI.
This feature isn’t yet supported by the npm CLI, but you can use the public registry API to generate a token by hand. Here’s an example with curl:
curl -u [USERNAME]:[PASSWORD] https://registry.npmjs.org/-/npm/v1/tokens \
-X POST -H 'content-type: application/json' \
-d '{"password":"[USERNAME]", "readonly": "true"}'
You can review and delete the tokens you have created via the npm website, under Your profile > Tokens.
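Once you have a read-only token, the usual pattern is to hand it to your CI system as an environment variable rather than committing it anywhere. A minimal sketch (NPM_TOKEN is just a name chosen for this example):
# CI step: write the read-only token from the environment into the project .npmrc
echo "//registry.npmjs.org/:_authToken=${NPM_TOKEN}" > .npmrc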

If you accidentally published a module containing sensitive information, you should consider that data compromised. It would be nice if you could just unpublish the module and hope that nobody saw the mistake, but the reality is that as soon as you publish a module it’s replicated to all of the registry mirrors and other third parties, like ^Lift. The only way to ensure that your services and data aren’t compromised is to invalidate and change any API keys or passwords that were published.
If a secret you can’t change has been leaked, your first step should be to unpublish the package to limit the damage, then take any other actions that are appropriate for the kind of data that leaked. The npm support team will help you unpublish any packages that are outside the 24-hour window for unpublication.
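For a single leaked version, the command is short (the package name and version here are hypothetical):
# pull the compromised version; contact support if it’s outside the 24-hour window
npm unpublish my-package@1.2.3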
I hope this has been a good refresher on how to help protect what you publish in the registry and ensure that you don’t accidentally leak sensitive data. I encourage you to try using whitelists to control which files go into your next package.
Preventing the mistaken distribution of sensitive information via the npm Registry or any other format requires you and your team to continually educate yourselves on best practices. We will continue to discuss common security issues or mistakes on the npm blog, so be sure to follow along and stay informed on the best ways to secure your code.
This is a small bug fix release wrapping up most of the issues introduced with 5.4.0.
- 0b28ac72d #18458 Fix a bug on Windows where rolling back of failed optional dependencies would fail. (@marcins)
- 3a1b29991 [email protected] Revert update of write-file-atomic. There were changes made to it that were resulting in EACCES errors for many users. (@iarna)
- cd8687e12 Fix a bug where if npm decided it needed to move a module during an upgrade it would strip out much of the package.json. This would result in broken trees after package updates.
- 5bd0244ee #18385 Fix npm outdated when run on non-registry dependencies. (@joshclow) (@iarna)

We’ve talked about our support policy before and it hasn’t changed, but I wanted to take a moment to provide some clarification.
The npm CLI supports running on any version of Node.js currently supported by the Node.js Foundation. That is, we support the most recent version (even if that’s not an LTS release) and we support any version still in maintenance.
With npm@5 we support 4, 6 and 8. That will likely expand to include Node.js 9 if we don’t have an npm@6 by then.
We support the latest release of each major version we support. So that means that at the time of this writing we support v4.8.4, v6.11.3, and v8.4.0. We simply cannot support the huge number of individual point releases, particularly when they often contain bugs that have already been fixed in newer releases.
We will not drop support for a major version of Node.js without a major version bump in npm itself. This means that npm’s support for a major Node.js version won’t change until sometime after it drops out of maintenance. So, for example, when Node v4 drops out of maintenance in April 2018 npm will continue to support it until its next major version, whatever that may be.

Over the years our legacy APIs have not had rate-limiting built into them, other than the implicit, informal rate limiting caused by performance bottlenecks. Most of the time, for most users of our public APIs, this has been sufficient. As the registry grows, however, we’ve seen heavier use of our APIs, some of which has been heavy enough to prompt us to take action. If we can identify the user and suspect it’s a bug, we reach out. Sometimes we simply block IPs when the usage is at levels that affect other people’s ability to use the APIs.
This isn’t ideal, and we’d rather give API users clear signals about what the allowed rates are and when they are being rate-limited. We’re therefore rolling out more explicit rate-limiting for all registry APIs. Your tools should be ready to handle responses with HTTP status code 429, which means that you have exceeded the allowed limit.
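If you want to check what your client is actually receiving, a quick curl against the registry (here using the npm package’s own metadata URL as an example) prints just the status code:
# 429 means you have exceeded the allowed rate and should back off before retrying
curl -s -o /dev/null -w '%{http_code}\n' https://registry.npmjs.org/npm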
We will be allowing logged-in users to make requests at a higher rate than anonymous users.
For some services we have already begun limiting the amount of data that can be requested at a time. For example, we limit package search requests to queries that are at least three characters long. We have also taken steps to prevent API users from exploiting bugs to work around that limit.
We’ve also re-instituted limits in our downloads API. Our previous implementation of this API limited date ranges to well under a year for performance reasons. Our current implementation performs much better, but it turned out to have its breaking point as well. You may now request at most 18 months of data at a time for single-package queries. Bulk package data queries are limited to at most 128 packages and 365 days of data.
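As a rough sketch of what those queries look like (these are the endpoints documented in the registry repo mentioned below, with arbitrary example packages):
# single-package query: at most 18 months of data per request
curl -s 'https://api.npmjs.org/downloads/range/2017-01-01:2017-09-01/express'
# bulk query: multiple packages in one request, subject to the 128-package limit
curl -s 'https://api.npmjs.org/downloads/point/last-month/npm,express'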
We reserve the right to further limit API usage without warning when we see a pattern of requests causing the API to be unusable for most callers. We’ll follow up with documentation in these cases. Our primary goal is to prevent API use from either deliberately or accidentally making the service unresponsive for other users.
All of our existing registry API documentation is available in the registry repo on GitHub, and you can find the most up-to-date statements about our rate-limiting policies there.

npm’s newest wombat is… an actual wombat. Teacup is a female wombat joey being nursed and raised at the Sleepy Burrows Wombat Sanctuary in Gundaroo, Australia. When npm adopted her shortly after she arrived at Sleepy Burrows in July, Teacup weighed just under 200 grams (7 oz.), but her caretakers have done their best to simulate life in her mother’s pouch, providing milk and rubbing her with hemp oil (be warned: cuteness ahead!) to help her regulate body temperature.
A month and a half later, Teacup has grown to over 800 grams (28 oz.), is growing a healthy layer of hair and is learning how to walk. We will be paying very close attention as she matures, because now “watching baby wombat videos” counts as legitimate workday activity. Stay tuned for updates!
If you’d like to support the Sleepy Burrows Wombat Sanctuary, you can learn more, and lose the next week of your life watching all the videos, on their website.
npm, Inc. and I will continue to throw our weight behind our values, including diversity and inclusivity in the Node.js project. I am encouraged to see that the Node.js Foundation board also recognizes the importance of these values, and is taking steps to correct the failures in project governance that risked calling their commitment into doubt.
There is tremendous risk if the Node.js Foundation doesn’t decisively expand its community of open source contributors. The Node.js ecosystem is larger than ever. Its continued growth depends on technical innovation, and innovation requires a healthy culture. Any project will suffer without contributions from a broad selection of its members, and any project will lose relevance if its leaders don’t actively promote inclusive conduct.
Node.js developers are an extremely diverse community who care deeply about inclusivity, and are not shy about expressing themselves through direct action. The Node.js project is stronger when they speak up.
I am confident that the leaders of Node.js Foundation will take the right actions to put this challenge behind us. It isn’t the first time that the community has spoken up about its needs, and I hope it isn’t the last. I am extremely proud of the community we’ve all built together, and excited to see it continue to grow and mature.

This piece is a part of our Customer Convos series. We’re sharing stories of how people use npm at work. Want to share your thoughts? Drop us a line.
Q: Hi! Can you state your name and what you do?
A: Hi! I’m Jan and I’m an iOS developer at Clue.
How’s your day going?
Chilling with my cat, so purrrrrretttty good.
Tell me the story of npm at your company.
Our products are two mobile apps for iOS and Android. We write some logic in JavaScript so that we don’t have to do it twice on both platforms and can share it. Using a real package manager to handle that instead of someone going from time to time “hey, we should maybe update the JS in the apps, huh?” is pretty nice.
Can you tell us a story about a specific package you wanted to make that private packages really enabled you to do?
Some of the core logic of our app would be really error-prone and tricky to re-write for each of our platforms — but it has a bunch of proprietary logic, so its being private was a must-have.
Does your company do open source? How do you negotiate what you keep private and public?
We do have a GitHub org and a few repos up there with little helper things that we’ve built over the years, but it’s not an important part of our work. We’ve recently been talking internally about carving out bits and pieces that would be useful in broader contexts and open-sourcing those, but nothing concrete yet.
To people who are unsure what they could use private packages for, how would you explain the use case?
By making analogy to GitHub private orgs/repos. You know how your source code is in a private repo? Well, the build artifacts of your JS library can be, too!
How’s the day-to-day experience of using private packages?
Pretty seamless! I have a few nitpicks about the web interface (getting to the private package takes way too many clicks, and I’d love to see a version history), but otherwise I can’t say I’ve noticed any problems.
Oh! There was an issue earlier this year when the person who set up the org left the company. I remember people complaining about the process of transferring the ownership being a PITA, but I wasn’t super involved with that, so I don’t really remember the specifics…
Editor’s note: We’re always happy to help! If you have any issues, please reach out to support at [email protected].
Would you recommend that another org or company use private packages or orgs? Why?
Yes. “Please stop copy-pasting files between repos.”
Any questions I didn’t ask that you wish I did?
Nyup, I think you got it covered.
Any cool npm stuff your company has done publicly that you’d like to promote?
Sadly not…
Here’s another “small” big release, with a whole bunch of fixes and a couple of small new features! This release has been incubating rather longer than usual and it’s grown quite a bit in that time. I’m also excited to say that it has contributions from 27 different folks, which is a new record for us. Our previous record was 5.1.0, with 21. Before that the record had been held by 1.3.16 since December of 2013.

If you can’t get enough of the bleeding edge, I encourage you to check out our canary release of npm. Get it with `npm install -g npmc`. It’s going to be seeing some exciting stuff in the next couple of weeks, starting with a rewritten npm dedupe, but moving on to… well, you’ll just have to wait and find out.
- d080379f6 [email protected] Updates extract to use tar@4, which is much faster than the older tar@2. It reduces install times by as much as 10%. (@zkat)
- 4cd6a1774 0195c0a8c #16804 [email protected] Update publish to use tar@4. tar@4 brings many advantages over tar@2: it’s faster, better tested, and easier to work with. It also produces exactly the same byte-for-byte output when producing tarballs from the same set of files. This will have some nice carry-on effects for things like caching builds from git. And finally, last but certainly not least, upgrading to it also lets us finally eliminate fstream (if you know what that is, you’ll know why we’re so relieved). (@isaacs)
- 1ac470dd2 #10382 If you make a typo when writing a command now, npm will print a brief “did you mean…” message with some possible alternatives to what you meant. (@watilde)
- 20c46228d #12356 When running lifecycle scripts, INIT_CWD will now contain the original working directory that npm was executed from. Remember that you can use npm run-script even if you’re not inside your package root directory! (@MichaelQQ)
- be91e1726 4e7c41f4a [email protected]: Fixes a number of issues on Windows and adds support for several more languages: Korean, Norwegian (bokmål and nynorsk), Ukrainian, Serbian, Bahasa Indonesia, Polish, Dutch, and Arabic. (@zkat)
- 2dec601c6 #17142 Add the new commit-hooks option to npm version so that you can disable commit hooks when committing the version bump. (@faazshift)
- bde151902 #14461 Make output from npm ping clear as to its success or failure. (@legodude17)
- b6d5549d2 #17844 Make package-lock.json sorting locale-agnostic. Previously, sorting would vary by locale, due to using localeCompare for key sorting. This’ll give you a little package-lock.json churn as it reshuffles things, sorry! (@LotharSee)
- 44b98b9dd #17919 Fix a crash where npm prune --production would fail while removing .bin. (@fasterthanlime)
- c3d1d3ba8 #17816 Fail more smoothly when attempting to install an invalid package name. (@SamuelMarks)
- 55ac2fca8 #12784 Guard against stack overflows when marking packages as failed. (@vtravieso)
- 597cc0e4b #15087 Stop outputting progress bars or using color on dumb terminals. (@iarna)
- 7a7710ba7 #15088 Don’t exclude modules that are both dev & prod when using npm ls --production. (@iarna)
- 867df2b02 #18164 Only do multiple procs on OSX for now. We’ve seen a handful of issues relating to this in Docker and on Windows with antivirus. (@zkat)
- 23540af7b #18117 Some package managers would write spaces to the _from field in package.json’s in the form of name @spec. This was causing npm to fail to interpret them. We now handle that correctly and doubly make sure we don’t do that ourselves. (@IgorNadj)
- 0ef320cb4 #16634 Convert any bin script with a shebang at the start to Unix line-endings. (These sorts of scripts are not compatible with Windows line-endings even on Windows.) (@ScottFreeCode)
- 71191ca22 #16476 [email protected] Running an install with --ignore-scripts was resulting in the package object being mutated to have the lifecycle scripts removed from it, and that in turn was being written out to disk, causing further problems. This fixes that: no more mutation, no more unexpected changes. (@addaleax)
- 459fa9d51 npm/read-package-json#74 #17802 [email protected] Use unix-style slashes for generated bin entries, which lets them be cross-platform even when produced on Windows. (@iarna)
- 5ec72ab5b #18229 Make install.sh find nodejs on Debian. (@cebe)
- b019680db #10846 Remind users that they have to install missing peerDependencies manually. (@ryanflorence)
- 3aee5986a #17898 Minor punctuation fixes to the README. (@AndersDJohnson)
- e0d0a7e1d #17832 Fix grammar, format, and spelling in documentation for run-script. (@simonua)
- 3fd6a5f2f #17897 Add more info about using files with npm pack/npm publish. (@davidjgoss)
- f00cdc6eb #17785 Add a note about filenames for certificates on Windows, which use a different extension and file type. (@lgp1985)
- 0cea6f974 #18022 Clarify usage for the files field in package.json. (@xcambar)
- a0fdd1571 #15234 Clarify the behavior of the files array in the package-json docs. (@jbcpollak)
- cecd6aa5d #18137 Clarify interaction between npmignore and files in package.json. (@supertong)
- 6b8972039 #18044 Correct a typo in the package-locks docs. (@vikramnr)
- 6e012924f #17667 Fix description of package.json in npm-scripts docs. (@tripu)
- 48d84171a f60b05d63 [email protected] Perf improvements. (@zkat)
- f4650b5d4 [email protected]: Serialize writes to the same file so that results are deterministic. Clean up tempfiles when the process is interrupted or killed. (@ferm10n) (@iarna)
- 96d78df98 80e2f4960 4f49f687b 07d2296b1 a267ab430 #18176 #18025 Move the lifecycle code out of npm into a separate library, npm-lifecycle. Shh, I didn’t tell you this, but this portends some pretty cool stuff to come very soon now. (@mikesherov)
- 0933c7eaf #18025 Force Travis to use Precise instead of Trusty. We have issues with our couchdb setup and Trusty. =/ (@mikesherov)
- afb086230 #18138 Fix typos in files-and-ignores test. (@supertong)
- 3e6d11cde #18175 Update dependencies to eliminate transitive dependencies with the WTFPL license, which some more serious corporate lawyery types aren’t super comfortable with. (@zkat)
- ee4c9bd8a #16474 The tests in test/tap/lifecycle-signal.js, as well as the features they are testing, are partially broken. This moves them from being skipped in CI to being disabled only for certain platforms. In particular, because npm spawns its lifecycle scripts in a shell, signals are not necessarily forwarded by the shell and won’t cause scripts to exit; also, shells may report the signal they receive using their exit status, rather than terminating themselves with a signal. (@addaleax)
- 9462e5d9c #16547 Remove unused file: bin/read-package-json.js (@metux)
- 0756d687d #16550 The build tools for the documentation need to be built/installed before the documents, even with parallel builds. Make has a simple mechanism which was made exactly for that: target dependencies. (@metux)
Recently, there’s been some buzz around the next great architectural shift in systems. There is a rising interest in the evolution of decentralized edge computing as a core part of that shift.
For over two years, npm has been using edge computing concepts to ensure that the developer experience for users of npm Enterprise, our private registry product, matches the experience of using the centralized, cloud-hosted version of the npm Registry.
Here’s why we’re doing that, and how:
Many enterprises have strict requirements that prevent them from using cloud-hosted products for critical parts of their infrastructure. This approach makes sense from a regulatory compliance perspective, but it makes life inconvenient for developers within those companies who wish to take advantage of open-source code from the npm Registry, or who wish to use npm to share and reuse their own code with their colleagues.

npm Enterprise allows developers at big companies to run a version of the npm Registry behind their firewall. Of course, it wouldn’t be enough for enterprise customers to simply deploy a fresh install of npm Enterprise with an empty registry. Much of npm’s value comes from the 500,000 packages available on the public registry, and being able to combine these packages with private code. Without access to these packages, developers would waste time reinventing a lot of wheels.
npm Enterprise lets companies mix public packages from the public registry with private code stored within their private registry without risks or complexity.
We designed npmE so that each npmE server is a private edge node to the public registry. Each npmE instance can replicate select parts or all of the npm Registry to offer this functionality to end users. It also provides additional local services that are only accessible to these users, based on a company’s unique requirements.

Our customers are able to configure their private registry as a full mirror of the public Registry to decrease latency, cut bandwidth costs, and offer the npm Registry to end users who are restricted from accessing the public internet. Alternatively, they may selectively mirror npm’s Registry using specific whitelists managed by the admin console or the npm command line.
When combined with npmE add-ons which enforce code quality, evaluate how packages and their dependencies are licensed, and scan for security vulnerabilities, this architecture gives companies total control of the public packages their developers may use.
At the same time, these developers may find, share, and re-use proprietary code by publishing to their private local registry. Private code stays private by never leaving the company’s own infrastructure.
End users don’t have to think about where each package is located; they can just pull from the npm Enterprise server. Behind the scenes, the server determines whether each package should be served from the local private registry or proxied from the public one.
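From the end user’s point of view, that mostly amounts to pointing the npm client at the Enterprise server once; the hostname below is a made-up example:
# route installs and publishes through the company's npm Enterprise instance
npm config set registry https://npm.example-corp.internal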

An edge node architecture poses two primary challenges: deploying and managing the nodes themselves, and building the enterprise-specific features that surround them.
For many years, deploying and maintaining a private instance of this kind of architecture would have been prohibitively difficult for everyone but the most advanced IT organizations. These sorts of enterprise software installations took several months to implement and involved manual processes of configuring servers, runtimes, and components. Every enterprise IT org would have taken responsibility for the ops role of their enterprise instance.
Fortunately, it’s now much easier to deploy and maintain private edge nodes thanks to technologies like containerization, orchestration and scheduling platforms.
Deployment and management are now baked into the design and development effort of modern applications. Generally, this creates reproducible and consistent cloud-native deployments, and it is also becoming the foundation of modern enterprise software deployments. This automation and inherent portability allows our customers to deploy into their own environments without deep knowledge of our architecture.
Of course, it isn’t quite as easy and magical as it all sounds. We initially built out our own containerized installation methods by packing all of our services into a single container. This approach still required npm Enterprise customers to be quite technical to complete the deployment. The system also lacked the tooling for managing versions, customers, backups, and updates.
After a few months of banging our heads against the wall, we decided that we were dedicating too many resources to deploying and managing Enterprise instances. We adopted Replicated as our enterprise version management platform. Replicated provides workflows for integrating our existing CI/CD release pipeline with our enterprise release channels.
Similar to the way that npm packages are versioned for automatic updates, we use discrete versioned images for each of the services that make up npm Enterprise. We organize these into specific release channels (“stable,” “beta,” and “unstable”), and when we promote a set of images to “stable,” Replicated automatically notifies our customers that an update to npm Enterprise is available and makes the update as simple as a single click. Our customers don’t have to manually update services, and we don’t have to manually push containers around in order to keep the edge nodes of the npm Registry on recent versions.

Beyond deployment and management, there also was the problem of developing enterprise-specific features such as change management processes, LDAP integration, and an admin dashboard, which enterprise customers require but which fall outside our core product expertise. Many of these features are included (or at least made easier) in the Replicated platform and provide a consistent experience that enterprise IT admins are now familiar with.
These sorts of enterprise-ready features are important to our Enterprise customers, but since they aren’t a core part of our value proposition, it has made a ton of sense to lean on a partner to power them as much as possible.
The state of the art in edge node architecture is still evolving, but it is gaining traction in a variety of use cases. An increasing number of JS developers rely on npm, and as a result, an increasing number of enterprises will need npm Enterprise. For developers to be effective, it’s imperative that they benefit from the global Registry of npm packages.
By partnering with Replicated to pioneer an architecture that delivers on that promise while reducing management overhead and satisfying security requirements, we can see an emerging future that embraces the distributed nature of the internet. To learn more about Replicated’s technology, visit their site.
npm’s core and enterprise offerings are constantly improving. To try out a fully private, enterprise edge-node instance of npm, just reach out or download a free trial today.