Conversation
docs/rfcs/xxx-pretranspile.md (Outdated)

> ## Motivation
>
> Transpiling packages on _publish_ rather than _load_ will have great benefits for package authors:
Does transpiling on publish, rather than on load, also have the benefit of making test coverage simpler (for package authors who want to set that up)?
Not necessarily. I haven't added anything to this about packages running in dev mode -- for the moment, dev mode, including the test runner, will still use Atom's transpiler pipeline. Although you could set up your package.json to run `prepare` before running tests...?
Istanbul also has a problem with Electron subprocesses not properly inheriting the code it's trying to inject, which this won't address.
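Something along those lines might look like this hypothetical `package.json` sketch (the compiler command, spec directory, and version range are illustrative, not part of this proposal):

```jsonc
// Hypothetical package.json fragment: run the prepare step before the specs.
{
  "scripts": {
    "prepare": "tsc -p .",
    "pretest": "npm run prepare",
    "test": "atom --test spec"
  },
  "devDependencies": {
    "typescript": "^2.9.0"
  }
}
```

The standard npm `pretest` hook would rebuild the transpiled output before the test runner starts.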
docs/rfcs/xxx-pretranspile.md (Outdated)

> ## Summary
>
> This feature will enable package authors to use conventional npm tooling and package.json conventions to take advantage of JavaScript transpilers like Babel or TypeScript.
Separate issue, but would we be able to use TypeScript in the GitHub package?
Why yes. Yes we would. 😁 Although I think Relay will give us problems when we do.
We definitely need to do this but I think native Node modules are going to throw a huge wrench in the works. I'm happy to chat about ideas for how we can solve that!
docs/rfcs/xxx-pretranspile.md (Outdated)

> ## Unresolved questions
>
> Do we want to deprecate transpilation-on-demand for local development, as well? It may add a bit of friction for package development, but transpilers like TypeScript tend to offer a `--watch` option to transpile live, and it would let us eliminate a lot of complexity in the way Atom loads JavaScript.
Over time, definitely. Having a more "typical" JS dev workflow would simplify a lot of things in Atom and remove some magic that causes problems.
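For example, a package developed this way could keep the transpiler running during local development with a watch script (a hypothetical sketch; the entry point and output directory are illustrative):

```jsonc
// Hypothetical package.json fragment: transpile once on install/publish, watch during development.
{
  "main": "lib/main.js",
  "scripts": {
    "prepare": "tsc -p .",
    "watch": "tsc -p . --watch"
  }
}
```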
docs/rfcs/xxx-pretranspile.md (Outdated)

> ## Drawbacks
>
> Doing this makes installing a package in production more different than loading it during development. This increases the number of variables that can cause issues between local development and the production of an `apm publish` artifact, like tweaking your `.npmignore` file properly.
One big problem is native Node modules. If packages can now ship pre-compiled tarballs, we'd have to include the right native bits for macOS, Linux, and Windows 32-bit and 64-bit. On top of that, you'll need to have separate builds for each Electron/Node.js ABI version that the package could possibly support, especially as we start upgrading Electron versions more frequently.
The `prebuild` module takes care of this for normal Node modules by creating binary tarballs for each platform and ABI version, but it quickly spirals out of control: https://github.com/daviwil/node-pty-prebuilt/releases/tag/v0.7.3. It's also unreasonable to expect the package developer to build these bindings across the three supported platforms.
I hesitate to say this, but we might have to provide our own precompilation service which does this on behalf of the package author when they publish an update. Like, a publish would have to kick off a job on some infrastructure to build the code and handle publishing the tarballs wherever they live. Not fun...
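For reference, the conventional `prebuild`/`prebuild-install` setup in a native module's `package.json` looks roughly like this (a sketch; the Electron target and version ranges are placeholders):

```jsonc
// Sketch of the usual prebuild setup; target and versions are placeholders.
{
  "scripts": {
    "install": "prebuild-install || node-gyp rebuild",
    "prebuild": "prebuild -r electron -t 2.0.0 --strip"
  },
  "dependencies": {
    "prebuild-install": "^4.0.0"
  },
  "devDependencies": {
    "prebuild": "^7.0.0"
  }
}
```

Every additional platform and ABI target multiplies the number of artifacts to build and upload, which is exactly the explosion in that node-pty-prebuilt release.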
> If packages can now ship pre-compiled tarballs, we'd have to include the right native bits for macOS, Linux, and Windows 32-bit and 64-bit.

The tarball created by `npm pack` does not include anything from transitive dependencies, including binaries:
```
smashwilson @ hubtop ~/src/atom/github (aw/split-github-tab %=) [1]
$ find . -name keytar.node
./node_modules/keytar/build/Release/keytar.node
smashwilson @ hubtop ~/src/atom/github (aw/split-github-tab %=)
$ tar ztvf github-0.18.0-1.tgz | grep 'keytar.node'
smashwilson @ hubtop ~/src/atom/github (aw/split-github-tab %=) [1]
$
```
`apm install` would continue to install those from the appropriate packages in the npm registry, including any that contain native code. Native dependencies would be installed from prebuilt binaries or compiled on the user's system, just as they are today.
The only difference would be that apm packages with top-level native dependencies (a `binding.gyp` and C++ source in the Atom package itself, not pulled in through an npm dependency) would now be able to distribute prebuilt binaries. Setting that up would be a bit of an adventure, though... personally I'd still prefer to extract the native bits to a separate npm package and let other people handle the `prebuild` setup for me 😉.
> I hesitate to say this, but we might have to provide our own precompilation service which does this on behalf of the package author when they publish an update.
That would be kind of cool to offer 🤔. I know the experience of, say, Windows users who want to install an Atom package with native dependencies is not ideal - I believe they need to install Node and hunt down native build tools and such, correct?
I think that would be a separate initiative, though.
Yeah, that's the biggest problem that Windows users have: they need to install the C++ compiler toolchain and Python just to get a terminal package (without prebuilds) to install correctly. What you say sounds fine to me, though; if we're still running `apm install` on the package after unpacking the tarball, I think it'll be fine.
docs/rfcs/xxx-pretranspile.md (Outdated)

> * The transition path for existing users of apm and atom.io is not as smooth.
> * Easier to typo `apm` and `npm` commands and have an undesirable outcome.
>
> ## Unresolved questions
Another unresolved question: Should we leverage S3 instead of GitHub Releases so that we can put these tarballs behind a CDN?
That's probably a better idea, yes. A little bit more infrastructure work but worth it. I'll update the writeup shortly 👍
Clicked the wrong button 😬
Another alternative might be setting up a custom private npm registry instead of (ab)using the public npm one?
docs/rfcs/xxx-pretranspile.md (Outdated)

> * Standard `npm` tooling like `prepare` scripts will work for apm packages exactly as they work for npm packages. This will remove the need for custom transpiler pipeline modules like [atom-babel6-transpiler](https://github.com/atom/atom-babel6-transpiler) or [atom-typescript-transpiler](https://github.com/smhxx/atom-ts-transpiler) with their own, independent documentation, configuration and setup.
> * Packages can move transpiler-related dependencies to `devDependencies` and trim installation bloat substantially. (As a data point, the TypeScript compiler is 30MB.)
> * First-time package load will no longer take a hit from transpiling all of the source into the cache. All package loads will benefit from no longer needing to check and side-load from the compiler cache.
Is this actually true? I thought that Atom currently transpiled everything down to a very basic ES5 in the compile cache, and that the custom transpilation feature just lets you control that. If you're talking about this RFC bringing Atom to running the package code directly as it is shipped, that could cause some rather subtle issues: as we have seen in the past, packages sometimes rely (accidentally or on purpose) on quirks of this "always transpiled" path.
> I thought that Atom currently transpiled everything down to a very basic ES5 in the compile cache, and that the custom transpilation feature just lets you control that.
Nope! JavaScript files have to explicitly opt in to the built-in Babel transpiler with a `/** @babel */` comment.
Lines 22 to 27 in e500f93:

```js
exports.shouldCompile = function (sourceCode) {
  var start = sourceCode.substr(0, PREFIX_LENGTH)
  return PREFIXES.some(function (prefix) {
    return start.indexOf(prefix) === 0
  })
}
```
When `shouldCompile()` returns `false`, the source is passed through and required as-is:
Lines 84 to 99 in e500f93:

```js
function compileFileAtPath (compiler, filePath, extension) {
  var sourceCode = fs.readFileSync(filePath, 'utf8')
  if (compiler.shouldCompile(sourceCode, filePath)) {
    var cachePath = compiler.getCachePath(sourceCode, filePath)
    var compiledCode = readCachedJavaScript(cachePath)
    if (compiledCode != null) {
      cacheStats[extension].hits++
    } else {
      cacheStats[extension].misses++
      compiledCode = compiler.compile(sourceCode, filePath)
      writeCachedJavaScript(cachePath, compiledCode)
    }
    return compiledCode
  }
  return sourceCode
}
```
> If you're talking about this RFC bringing Atom to running the package code directly as it is shipped, that could cause some rather subtle issues: as we have seen in the past, packages sometimes rely (accidentally or on purpose) on quirks of this "always transpiled" path.
Eventually we might be able to eliminate the compile cache entirely, at least in non-dev mode, but we couldn't think about that until enough packages transition to using `prepare` steps and re-publish as tarballs anyway.
We'll have to leave it in for the foreseeable future to handle old packages, though, so we won't be able to get the performance boost of not having to redundantly slurp the source code of every module when it's required. I'll take that second sentence out, then.
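For concreteness, the kind of conventional `prepare` setup packages would migrate to might look like this (a hypothetical `package.json` sketch; the package name, Babel 6 tooling, and directory layout are illustrative, and a separate `.babelrc` is assumed):

```jsonc
// Hypothetical package: ship transpiled lib/, keep Babel in devDependencies.
{
  "name": "my-atom-package",
  "main": "lib/index.js",
  "files": ["lib"],
  "scripts": {
    "prepare": "babel src --out-dir lib"
  },
  "devDependencies": {
    "babel-cli": "^6.26.0",
    "babel-preset-env": "^1.7.0"
  }
}
```

With the transpiled `lib/` output included in the published tarball, Atom could require `lib/index.js` directly without consulting the compile cache.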
docs/rfcs/xxx-pretranspile.md (Outdated)

> ### Package publishing
>
> During the `apm publish` call, apm will invoke [`npm pack`](https://docs.npmjs.com/cli/pack) to run all standard npm lifecycle hooks and prepare a `.tar.gz` file. apm then creates a release associated with the created tag and uploads the `.tar.gz` file as an attachment called `package.tar.gz`.
Will more "manual" workflows still be supported? For example right now if you want to create a signed release tag you need to do everything manually.
I'm guessing that as long as the package.tar.gz
is attached and the version fits the appropriate format this will be the case, but it might be better to explicitly state this.
You mean doing the git steps manually and using `apm publish --tag` to upload it to atom.io? Yeah, that should still work fine. I'll add a note 🙇
> You mean doing the git steps manually and using `apm publish --tag` to upload it to atom.io?
Yep, exactly.
This idea was tossed around early in Atom's life, actually. I believe we didn't go that direction because we lacked the resources to properly administer and run ops for a CouchDB database? We could always re-evaluate, but offhand, it feels like going the private registry direction is a much larger delta from what we have today. I was trying to write this up as a "quick fix" I could implement in a fairly short amount of time.
Makes perfect sense! I just thought of it as an alternative and thought I would bring it up 😉.
Force-pushed from 2728a22 to af6da3d.
We would love to work on this someday.
Poking around in apm and npm's internals again sparked this idea. This is a proposal for what I think is the simplest possible way to support pre-transpiled apm packages.
✨ 🖨️ rendered 🖨️ ✨