
TypeError in low_level_api_chat_cpp.py due to Incorrect Type passed #79

@zatevakhin

Description


In the file examples/low_level_api/low_level_api_chat_cpp.py, a value of the wrong type is returned at lines L316-L317: a str is returned where a llama_token (an alias for ctypes.c_int) is expected.
This subsequently causes an error at line L358.

Frequency: Sometimes

Traceback (most recent call last):
  File "/workspace/llama-plugins/interactive.py", line 494, in <module>
    m.interact()
  File "/workspace/llama-plugins/interactive.py", line 439, in interact
    for i in self.output():
  File "/workspace/llama-plugins/interactive.py", line 428, in output
    yield llama_cpp.llama_token_to_str(self.ctx, id).decode("utf-8")
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
ctypes.ArgumentError: argument 2: TypeError: wrong type
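To see why ctypes raises this error, here is a minimal, self-contained sketch of the same argument check. It does not use llama_cpp at all; it uses libc's abs() purely as a stand-in for a C function whose argtypes declare a c_int parameter, which is exactly how llama_token is declared in the bindings:

```python
import ctypes

# In the llama_cpp bindings, llama_token is just an alias for ctypes.c_int.
llama_token = ctypes.c_int

# Stand-in demo: libc's abs() takes a single C int, so declaring its
# argtypes with llama_token reproduces the same ctypes argument check
# that llama_token_to_str performs on its token argument.
libc = ctypes.CDLL(None)  # POSIX; on Windows you would load msvcrt instead
libc.abs.argtypes = [llama_token]
libc.abs.restype = ctypes.c_int

print(libc.abs(-5))  # → 5; a real token id (int) is accepted

try:
    libc.abs("not a token")  # passing a str, as the buggy code path does
except ctypes.ArgumentError as exc:
    print("ctypes.ArgumentError raised:", exc)
```

This is why the error only appears "Sometimes": the generator usually yields proper int token ids, and only the end-of-text path yields a str.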

P.S. Line numbers in the snippet above are incorrect due to the applied code formatting tool.

Here are two ways to fix it:

  1. Return EOS:

if len(self.embd) > 0 and self.embd[-1] == llama_cpp.llama_token_eos():
    if not self.params.instruct:
        yield llama_cpp.llama_token_eos()
    break

  2. Tokenize the string " [end of text]\n", and return the tokens.
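The second fix can be sketched as follows. This is only an illustration: the tokenize() stub below is hypothetical (it maps characters to ids), standing in for a real call to llama_cpp.llama_tokenize against the loaded context. The point is that the generator then yields only int token ids, which the downstream llama_token_to_str call will accept:

```python
def tokenize(text):
    # Hypothetical stand-in for llama_cpp.llama_tokenize(ctx, text, ...):
    # maps each character to an id, purely for illustration.
    return [ord(c) for c in text]

def output_tokens():
    # ...normal generation loop yields int token ids...
    yield 42  # an ordinary sampled token (illustrative id)
    # On end of generation, yield tokens rather than the raw str:
    for tok in tokenize(" [end of text]\n"):
        yield tok

# Every value produced is an int, so the ctypes check passes downstream.
assert all(isinstance(t, int) for t in output_tokens())
```

Compared with fix 1, this preserves the visible " [end of text]" marker in non-instruct mode instead of silently stopping at EOS.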
