update torch version to 2.7 by tonyhoo · Pull Request #4095 · fastai/fastai · GitHub

update torch version to 2.7 #4095


Merged
jph00 merged 1 commit into fastai:main on May 23, 2025

Conversation

@tonyhoo (Contributor) commented on May 23, 2025

This PR updates the PyTorch version constraint in settings.ini to allow compatibility with PyTorch 2.7.

Changes

  • Updated pip_requirements from torch>=1.10,<2.7 to torch>=1.10,<2.8
  • Updated conda_requirements from pytorch>=1.10,<2.7 to pytorch>=1.10,<2.8 (see the snippet after this list)
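For reference, this is roughly how the relevant lines in settings.ini read after the change. This is a sketch only: unrelated keys are omitted, and any other packages listed in those requirement strings are elided with "...", not removed.

    # settings.ini (sketch: unrelated keys and additional packages elided)
    pip_requirements = torch>=1.10,<2.8 ...
    conda_requirements = pytorch>=1.10,<2.8 ...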

Testing

I've tested this change locally with PyTorch 2.7.0 and confirmed that basic functionality works (a sketch of this kind of smoke test follows the list):

  • Model training with vision_learner
  • DataLoaders functionality
  • Basic operations with tensors
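Below is a minimal sketch of that kind of smoke test; it is not the exact script used for this PR. It assumes fastai and PyTorch 2.7.0 are installed and uses the bundled MNIST_SAMPLE dataset as a stand-in:

    import torch
    from fastai.vision.all import *

    # Confirm the environment picked up the new PyTorch release
    print(torch.__version__)  # expected to start with "2.7"

    # DataLoaders functionality: build loaders from a small bundled dataset
    path = untar_data(URLs.MNIST_SAMPLE)
    dls = ImageDataLoaders.from_folder(path, train='train', valid='valid')
    xb, yb = dls.one_batch()

    # Model training with vision_learner: one short cycle is enough to
    # exercise the forward/backward pass end to end
    learn = vision_learner(dls, resnet18, metrics=accuracy)
    learn.fit_one_cycle(1)

    # Basic tensor operations
    t = torch.randn(2, 3)
    assert (t + t).shape == (2, 3)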

Motivation

This change is needed to allow projects that depend on fastai (such as AutoGluon) to use the latest PyTorch versions without version conflicts.

@tonyhoo requested a review from jph00 as a code owner on May 23, 2025, 16:08
@warner-benjamin (Collaborator)

I was going to submit a PR with the same change. I can also confirm that all of the tests pass when upgrading to PyTorch 2.7 with CUDA 12.8.

@jph00 (Member) commented on May 23, 2025

Thank you!

@jph00 merged commit 9178dd1 into fastai:main on May 23, 2025
28 checks passed
@tonyhoo deleted the torch_update branch on May 25, 2025, 00:45