the output logits is a str · Issue #113 · evo-design/evo · GitHub

the output logits is a str #113


Open

xuanwuji opened this issue Feb 7, 2025 · 1 comment

Comments

xuanwuji commented Feb 7, 2025

For network reasons, I loaded the Evo model locally. My code is as follows:

```python
import os
os.environ['TRANSFORMERS_OFFLINE'] = "1"

import torch
from transformers import AutoConfig, AutoModelForCausalLM
from stripedhyena.tokenizer import CharLevelTokenizer

model_name = './evo-1-131k-base'
device = "cuda:0"

config = AutoConfig.from_pretrained(model_name, trust_remote_code=True, revision="1.1_fix")
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    config=config,
    trust_remote_code=True,
    revision="1.1_fix"
)

tokenizer = CharLevelTokenizer(512)

model.to(device)
sequence = 'ACGT'
input_ids = torch.tensor(
    tokenizer.tokenize(sequence),
    dtype=torch.int,
).to(device).unsqueeze(0)

with torch.no_grad():
    logits, _ = model(input_ids)  # (batch, length, vocab)

print('Logits: ', logits)
print('Shape (batch, length, vocab): ', logits.shape)
```

But when I run it, the `logits` I get is of type `str`. Does anyone know why this is happening?

My output log is:

```
Initializing inference params... Logits: logits
```

@silentpotatos

Use this code:

```python
output = model(input_ids)  # (batch, length, vocab)
logits = output["logits"]
```
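For context, here is a minimal sketch of the underlying pitfall, using a plain `dict` as a stand-in for the model's dict-like output (the keys shown are illustrative): tuple-unpacking a mapping iterates over its *keys*, so `logits, _ = model(input_ids)` binds the string `'logits'` instead of the tensor. Indexing by key avoids this.

```python
# Stand-in for a dict-like model output; the actual keys/values come
# from the model and are assumed here for illustration.
output = {"logits": [[0.1, 0.2]], "past_key_values": None}

# Unpacking a mapping iterates over its KEYS, not its values:
logits, _ = output
assert logits == "logits"  # a plain string -- the bug in the question

# Correct: index by key to get the actual value.
logits = output["logits"]
assert logits == [[0.1, 0.2]]
```

This is why the log printed `Logits: logits`: the first key of the output mapping is the string `"logits"`.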
