HALLUCINATION #556
Replies: 2 comments 2 replies
-
Hi, I am facing the same situation. Were you able to solve the problem? I am using a local LLM, gemma2:27b.
-
Hi, bro, I ran into this problem too. My code is below; it looks like this solved the problem for me.

Here is the complete code; I used the Flask example code:

```python
load_dotenv()
from functools import wraps

app = Flask(__name__, static_url_path='')

# Read the config file
config = configparser.ConfigParser()

# Extract the database connection parameters
db_config = config['mysql']

# Ollama configuration
ollama_config = config['ollama']

# SETUP
cache = MemoryCache()

# A simple Chinese stop-word list; extend it as needed
stop_words = ["的", "了", "是", "在", "和", "也", "就", "都", "很"]

# Extract the Chinese keywords from the query text
def extract_keywords(text):
    ...

class MyVanna(ChromaDB_VectorStore, Ollama):
    ...

# Instantiate with the parameters read from the config file
vn = MyVanna(config={ ... })

# Fetch the table metadata, including comments
df_tables = vn.run_sql(...)
print("df_tables:", df_tables)

# Fetch the column metadata, including comments
df_columns = vn.run_sql(...)

# Print the final result
print("df_columns:", df_columns)

ddl_statements = []
for table_name, table_comment in zip(df_tables['TABLE_NAME'], df_tables['TABLE_COMMENT']):
    ...

def requires_cache(fields):
    ...

# Route handlers (bodies elided in the original post)
@app.route('/api/v0/generate_questions', methods=['GET'])
@app.route('/api/v0/generate_sql', methods=['GET'])
@app.route('/api/v0/run_sql', methods=['GET'])
@app.route('/api/v0/download_csv', methods=['GET'])
@app.route('/api/v0/generate_plotly_figure', methods=['GET'])
@app.route('/api/v0/get_training_data', methods=['GET'])
@app.route('/api/v0/remove_training_data', methods=['POST'])
@app.route('/api/v0/train', methods=['POST'])
@app.route('/api/v0/generate_followup_questions', methods=['GET'])
@app.route('/api/v0/load_question', methods=['GET'])
@app.route('/api/v0/get_question_history', methods=['GET'])
@app.route('/')
def append_to_json_file(filename, new_data):
    ...

# Function that trains on the data
def train_data():
    ...

if __name__ == '__main__':
    ...
```
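The loop above builds DDL statements from the table and column metadata so the model is grounded in the real schema. A minimal self-contained sketch of that idea, with hypothetical metadata standing in for the real `df_tables`/`df_columns` query results:

```python
# Hypothetical metadata; in the code above this comes from vn.run_sql()
# queries against information_schema.
tables = [("orders", "customer orders")]
columns = {
    "orders": [
        ("id", "INT", "primary key"),
        ("total", "DECIMAL(10,2)", "order total"),
    ],
}

def build_ddl(table, table_comment, cols):
    """Assemble a CREATE TABLE statement that carries the comments along."""
    body = ",\n  ".join(f"{name} {ctype} COMMENT '{comment}'"
                        for name, ctype, comment in cols)
    return f"CREATE TABLE {table} (\n  {body}\n) COMMENT '{table_comment}';"

ddl_statements = [build_ddl(t, c, columns[t]) for t, c in tables]

# Feeding the DDL to Vanna via vn.train(ddl=...) adds it as schema context:
# for ddl in ddl_statements:
#     vn.train(ddl=ddl)
```

Training on DDL that includes table and column comments gives the model both the real names and their meaning, which is what keeps it from inventing tables.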
-
We are facing a critical hallucination problem: the generated SQL contains hallucinated tables and columns that do not exist in my schema. How do I tackle that? I am using gpt-3.5-turbo-16k.
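Besides training on the real DDL, one pragmatic guardrail is to validate the generated SQL against the actual schema before executing it, and reject or re-prompt when it references unknown tables. A minimal stdlib-only sketch (the schema set and helper name are illustrative; a real SQL parser such as sqlglot would be more robust than a regex):

```python
import re

def find_unknown_tables(sql: str, known_tables: set) -> set:
    """Return table names referenced after FROM/JOIN that are not in the schema.

    A rough regex-based check -- good enough as a guardrail,
    not a substitute for a full SQL parser.
    """
    referenced = {
        m.group(1).lower()
        for m in re.finditer(r"\b(?:from|join)\s+([A-Za-z_][A-Za-z0-9_]*)",
                             sql, re.IGNORECASE)
    }
    return referenced - {t.lower() for t in known_tables}

# e.g. loaded once from information_schema.tables
schema = {"customers", "orders"}

sql = "SELECT c.name FROM customers c JOIN invoices i ON i.customer_id = c.id"
print(find_unknown_tables(sql, schema))  # -> {'invoices'}: reject and re-prompt
```

If the check returns a non-empty set, you can refuse to run the query and instead ask the model to regenerate, including the list of valid table names in the prompt.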