Running the graph below produces the following warning:

Token indices sequence length is longer than the specified maximum sequence length for this model (1385 > 1024). Running this sequence through the model will result in indexing errors
from scrapegraphai.graphs import SmartScraperGraph

smart_scraper_graph = SmartScraperGraph(
    prompt="List all links",
    source="https://www.google.com/search?q=why+it+doesn%27t+work",
    schema=Properties,    # output schema; defined elsewhere in the original report
    config=graph_config,  # graph configuration; defined elsewhere in the original report
)
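The snippet references Properties and graph_config without defining them. For completeness, here is a minimal, purely hypothetical sketch of definitions that would have to precede the snippet, plus the call that triggers the warning; none of these values come from the original report:

from pydantic import BaseModel

class Properties(BaseModel):
    # Hypothetical schema matching the "List all links" prompt.
    links: list[str]

graph_config = {
    "llm": {
        "model": "ollama/llama3.2",  # assumed from the ollama/llama3.2 model referenced in the reply below
    },
}

result = smart_scraper_graph.run()  # running the graph emits the token-length warning
print(result)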
Hey there, @AceCodePt! 👋 I'm here to help with any bugs, questions, or contributions you may have while waiting for a human maintainer. Let's tackle this together!
There is a similar closed issue regarding the sequence length problem with the ollama/llama3.2 model. It was resolved by a pull request that made the model_tokens dictionary key accessible for all models. The fix shipped in version 1.27.0-beta.13 and was confirmed resolved in version 1.28.0, so updating to version 1.28.0 or later should resolve the issue [1][2].
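For reference, a minimal sketch of a graph_config that sets the model_tokens key explicitly once you are on version 1.28.0 or later. It assumes the ollama/llama3.2 backend mentioned above; the base_url and token count are illustrative values, not taken from the original report:

graph_config = {
    "llm": {
        "model": "ollama/llama3.2",
        "model_tokens": 8192,  # raise the limit above the 1385-token input from the warning
        "base_url": "http://localhost:11434",  # illustrative local Ollama endpoint
    },
    "verbose": True,
}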
Originally posted by @AceCodePt in #875 (comment)