abc8282
Joined September 25, 2024
abc8282
·

Graphs

Can I create a ComposableGraph on top of ComposableGraphs of some indices? If not, is there any alternative way to create a hierarchical structure of indices?
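Whether a ComposableGraph can be built directly over other ComposableGraphs depends on the llama_index version, so check the library docs for your release. The underlying pattern, though (a root index that routes queries down to child indices, where a child may itself be a composed graph), can be sketched without the library. Everything below (`ToyIndex`, `ToyGraph`) is a hypothetical illustration, not the llama_index API:

```python
# Toy sketch of hierarchical index composition: a root "graph" fans a query
# out to children, and a child may itself be another graph, giving nesting.
# These class names are illustrative only, not llama_index classes.

class ToyIndex:
    """A leaf index: holds documents and answers queries over them."""
    def __init__(self, summary, docs):
        self.summary = summary
        self.docs = docs

    def query(self, q):
        # Naive keyword match; a real index would do retrieval + synthesis.
        return [d for d in self.docs if q.lower() in d.lower()]

class ToyGraph:
    """A composed index: routes a query to all children and merges results.
    Children can be ToyIndex leaves or other ToyGraphs (hierarchy)."""
    def __init__(self, children):
        self.children = children

    def query(self, q):
        hits = []
        for child in self.children:
            hits.extend(child.query(q))
        return hits

leaf_a = ToyIndex("notes", ["Graph indexing notes", "misc"])
leaf_b = ToyIndex("docs", ["graph theory primer"])
inner = ToyGraph([leaf_a, leaf_b])   # a graph over plain indices
root = ToyGraph([inner])             # a graph over a graph: the hierarchy

print(root.query("graph"))  # → ['Graph indexing notes', 'graph theory primer']
```

If the installed llama_index version rejects a graph-over-graphs construction, the same effect can usually be had by giving each sub-graph a summary and composing at one level, since routing only needs each child to expose a query interface.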
2 comments
abc8282
·

````
Setting `pad_token_id` to `eos_token_id`:50256 for open-end generation.
/usr/local/lib/python3.9/dist-packages/transformers/generation/utils.py:1313: UserWarning: Using `max_length`'s default (300) to control the generation length. This behaviour is deprecated and will be removed from the config in v5 of Transformers -- we recommend using `max_new_tokens` to control the maximum length of the generation.
  warnings.warn(
Input length of input_ids is 3635, but `max_length` is set to 300. This can lead to unexpected behavior. You should consider increasing `max_new_tokens`.
---------------------------------------------------------------------------
IndexError                                Traceback (most recent call last)
<ipython-input-53-0f83071d088d> in <cell line: 1>()
----> 1 response = index.query("What did the author do growing up?")
      2 print(response)
--------------------------- 42 frames ------------------------------
/usr/local/lib/python3.9/dist-packages/torch/nn/functional.py in embedding(input, weight, padding_idx, max_norm, norm_type, scale_grad_by_freq, sparse)
   2208         # remove once script supports set_grad_enabled
   2209         _no_grad_embedding_renorm_(weight, input, max_norm, norm_type)
-> 2210     return torch.embedding(weight, input, padding_idx, scale_grad_by_freq, sparse)
   2211 
   2212 

IndexError: index out of range in self
````

Any idea what is causing this warning and error?
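The traceback points at an embedding lookup, which fails when a token or position index exceeds the embedding table's size. Here the prompt is 3635 tokens; if the model is GPT-2-style (the `pad_token_id` 50256 suggests so, and GPT-2's context window is 1024 positions), the position-embedding lookup for position 1024+ is out of range. The usual fix is to truncate or chunk the input below the model's context window, and to pass `max_new_tokens` rather than relying on `max_length`. A minimal, library-free sketch of the failure mode (the 1024 limit is an assumption about the model):

```python
# Sketch of why the IndexError happens: an embedding table has one row per
# supported position, and a lookup past the last row is out of range.

MAX_POSITIONS = 1024  # assumed context window (e.g. GPT-2); not from the log
position_table = list(range(MAX_POSITIONS))  # stand-in for embedding rows

def embed_positions(n_tokens):
    # One lookup per token position, like the embedding call in the traceback.
    return [position_table[i] for i in range(n_tokens)]

ok = embed_positions(300)       # fine: all positions < MAX_POSITIONS

try:
    embed_positions(3635)       # the prompt length from the warning
except IndexError as e:
    # A plain list raises "list index out of range"; torch phrases the same
    # failure as "index out of range in self".
    print("IndexError:", e)
```

So the warning (`max_length` default of 300 vs. 3635 input tokens) and the IndexError are two symptoms of the same problem: the prompt is far longer than the model can consume in one pass.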
3 comments