Did not find openai_api_key after OS upgrade

Comments

5 comments

  • Gianfranco Campana

    Solved.
    I simply pulled an older version, then pulled the latest one again. It's working again.
    No idea why that was.

  • Candido Dessanti

    Hi Gianfranco,

    Thank you for resolving the issue. I'm curious about the steps that triggered the problem and the ones you took to address it.
    Did you perform an in-place upgrade of the OS, causing the Docker version of HeavyIQ to stop working?
    Also, to restore the original behavior, is it necessary to downgrade Docker to a version prior to 8.0.1 and then reinstall the 8.0.1 version?

    Candido

  • Gianfranco Campana

    Hi Candido,
    here's the recap:

    I had a Kubuntu 20.04 LTS system with HeavyAI Free Enterprise GPU on Docker, v7.0.4, with everything in the system up to date.

    • I pulled the new 8.0 HeavyAI image with the old license key still in place: HeavyAI 8.0, of course, returned a license error and could not be started:
      Server Error: License failure: Your license version is incompatible with the 8.0.1 release. Please contact HEAVY.AI support to request a version compatible license 
    • I solved this by pulling the 7.2.4 image, updating the license key with the new one I had obtained in the meantime from the website, and then pulling the 8.0 version again.
    • While executing:
      docker-compose up -d

    I had:

    ERROR: for jupyterlab-tmp  'ContainerConfig'
    ERROR: for jupyterhub  'ContainerConfig'
    ....

    That's because the standalone "docker-compose" (Compose v1) is deprecated; please replace it in the documentation with:

    "docker compose"

    Apart from that, the containers were created fine.
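
    For reference, the Compose v2 form is the same command without the hyphen, invoked as a Docker CLI plugin (a sketch; it assumes a compose file in the current directory and a running Docker daemon):

```shell
# Compose v2 is a "docker" subcommand; the standalone v1 binary is deprecated
docker compose up -d
docker compose ps
```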

    At this point I had a fully working HeavyAI and HeavyIQ in my Docker instance on Ubuntu 20.04.
    Then I performed an in-place upgrade of the OS to 22.04 LTS.

    • The upgrade was simple and straightforward, with no errors.
    • After the upgrade, everything but HeavyIQ was working fine.
      HeavyIQ was returning:
      Value Error: 1 validation error for ChatOpenAI __root__ Did not find openai_api_key, please add an environment variable `OPENAI_API_KEY` which contains it, or pass `openai_api_key` as a named parameter. (type=value_error)
    • The HeavyAI Docker image showed no errors: I had access to Immerse and to my data and dashboards.
      I always had this in my log:
    2024-06-06T14:27:07.447586558Z Backend TCP:  localhost:6274      
    2024-06-06T14:27:07.447631630Z Backend HTTP: localhost:6278      
    2024-06-06T14:27:07.447642568Z Frontend Web: localhost:6273                        
    2024-06-06T14:27:07.447649533Z Calcite TCP:  localhost:6279      
    2024-06-06T14:27:07.449455769Z - heavydb 28 started              
    2024-06-06T14:27:07.453689483Z - heavy_web_server 31 started    
    2024-06-06T14:27:09.810793486Z ? http server started on [::]:6273
    2024-06-06T14:27:12.455508871Z Navigate to: http://localhost:6273
    2024-06-06T14:27:17.469841005Z HeavyIQ HTTP:  localhost:6275    
    2024-06-06T14:27:17.469889656Z - heavy_iq 58 started
    • At this point I pulled the 7.2.4 version of the Docker image again.
      I made sure everything was working fine.
    • Then I pulled the 8.0.1 version of the Docker image again.
      Now everything is working fine, including HeavyIQ.
      The error is gone.

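
    For what it's worth, the error message itself suggests providing the key through the environment. If the problem reappears, a compose override along these lines might be worth trying before a full downgrade cycle (a hypothetical, untested sketch — the service name heavyaiserver and the variable's source are assumptions; check your actual compose file):

```yaml
# docker-compose.override.yml -- hypothetical sketch, service name is an assumption
services:
  heavyaiserver:
    environment:
      # forwarded from the host shell; export OPENAI_API_KEY there first
      - OPENAI_API_KEY=${OPENAI_API_KEY}
```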
  • Gianfranco Campana

    And on the 8.1 Docker version, completely at random, I just got the same error again, without any change to the system: two hours ago the error was not showing up.

    Here is the full log I found in the HeavyIQ console; I hope it helps.

    Started chromadb server...
    Args:
    --path storage/rag_storage/chromadb
    --port 8009
    See logs at chromadb.log
    Started monitoring chromaDB server process...
    127.0.0.1:53366 - "GET /version.txt HTTP/1.1" 200
    Traceback (most recent call last):
      File "/opt/heavyai/heavyiq/.venv/lib/python3.10/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
        await app(scope, receive, sender)
      File "/opt/heavyai/heavyiq/.venv/lib/python3.10/site-packages/starlette/routing.py", line 74, in app
        response = await func(request)
      File "<frozen heavyiq.api.routes.log_route>", line 21, in custom_route_handler
      File "/opt/heavyai/heavyiq/.venv/lib/python3.10/site-packages/fastapi/routing.py", line 278, in app
        raw_response = await run_endpoint_function(
      File "/opt/heavyai/heavyiq/.venv/lib/python3.10/site-packages/fastapi/routing.py", line 191, in run_endpoint_function
        return await dependant.call(**values)
      File "<frozen heavyiq.api.routes.iq_lcel_router>", line 73, in auto_query
      File "<frozen heavyiq.api.handlers.decorators>", line 38, in wrapper
      File "<frozen heavyiq.api.handlers.decorators>", line 36, in wrapper
      File "<frozen heavyiq.api.handlers.decorators>", line 61, in wrapper
      File "<frozen heavyiq.api.handlers.lcel_handler>", line 72, in handle_lcel_auto_query_request
      File "/opt/heavyai/heavyiq/.venv/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 2109, in ainvoke
        input = await step.ainvoke(
      File "/opt/heavyai/heavyiq/.venv/lib/python3.10/site-packages/langchain_core/runnables/passthrough.py", line 447, in ainvoke
        return await self._acall_with_config(self._ainvoke, input, config, **kwargs)
      File "/opt/heavyai/heavyiq/.venv/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 1313, in _acall_with_config
        output = await coro
      File "/opt/heavyai/heavyiq/.venv/lib/python3.10/site-packages/langchain_core/runnables/passthrough.py", line 434, in _ainvoke
        **await self.mapper.ainvoke(
      File "/opt/heavyai/heavyiq/.venv/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 2739, in ainvoke
        results = await asyncio.gather(
      File "/opt/heavyai/heavyiq/.venv/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 2109, in ainvoke
        input = await step.ainvoke(
      File "/opt/heavyai/heavyiq/.venv/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 4081, in ainvoke
        return await self.bound.ainvoke(
      File "/opt/heavyai/heavyiq/.venv/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 2109, in ainvoke
        input = await step.ainvoke(
      File "/opt/heavyai/heavyiq/.venv/lib/python3.10/site-packages/langchain_core/runnables/passthrough.py", line 447, in ainvoke
        return await self._acall_with_config(self._ainvoke, input, config, **kwargs)
      File "/opt/heavyai/heavyiq/.venv/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 1313, in _acall_with_config
        output = await coro
      File "/opt/heavyai/heavyiq/.venv/lib/python3.10/site-packages/langchain_core/runnables/passthrough.py", line 434, in _ainvoke
        **await self.mapper.ainvoke(
      File "/opt/heavyai/heavyiq/.venv/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 2739, in ainvoke
        results = await asyncio.gather(
      File "/opt/heavyai/heavyiq/.venv/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 4081, in ainvoke
        return await self.bound.ainvoke(
      File "/opt/heavyai/heavyiq/.venv/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 3543, in ainvoke
        return await self._acall_with_config(
      File "/opt/heavyai/heavyiq/.venv/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 1313, in _acall_with_config
        output = await coro
      File "/opt/heavyai/heavyiq/.venv/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 3490, in _ainvoke
        output = await acall_func_with_variable_args(
      File "<frozen heavyiq.lcel.chains.heavydb.table_chain>", line 60, in get_and_retrieve_table_info
      File "<frozen heavyiq.lcel.chains.heavydb.table_chain>", line 51, in get_tables
      File "<frozen heavyiq.lcel.chains.heavydb.table_chain>", line 46, in retrieve_table_names_from_vectordb
      File "<frozen heavyrag.main>", line 3, in <module>
      File "<frozen heavyrag.main>", line 13, in <module>
      File "<frozen heavyrag.evaluate>", line 3, in <module>
      File "<frozen heavyrag.evaluate>", line 8, in <module>
      File "<frozen heavyrag.llm>", line 93, in get_llm
      File "/opt/heavyai/heavyiq/.venv/lib/python3.10/site-packages/cachetools/__init__.py", line 741, in wrapper
        v = func(*args, **kwargs)
      File "<frozen heavyiq.langchain.llms>", line 100, in get_llm_by_type
      File "<frozen heavyiq.langchain.llms>", line 190, in get_openai_llm_by_model_name
      File "<frozen heavyiq.langchain.llms>", line 222, in _get_openai_chat_llm
      File "/opt/heavyai/heavyiq/.venv/lib/python3.10/site-packages/langchain_core/load/serializable.py", line 120, in __init__
        super().__init__(**kwargs)
      File "/opt/heavyai/heavyiq/.venv/lib/python3.10/site-packages/pydantic/v1/main.py", line 341, in __init__
        raise validation_error
    pydantic.v1.error_wrappers.ValidationError: 1 validation error for ChatOpenAI
    __root__
      Did not find openai_api_key, please add an environment variable `OPENAI_API_KEY` which contains it, or pass `openai_api_key` as a named parameter. (type=value_error)
    
    127.0.0.1:53372 - "POST /api/v1/lcel/auto/query HTTP/1.1" 500
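
    A quick check that narrows this down is whether the key is visible to the HeavyIQ process at all — a minimal sketch (run inside the container, e.g. via `docker exec`, which is an assumption about your setup):

```python
import os

# ChatOpenAI falls back to the OPENAI_API_KEY environment variable when no
# openai_api_key parameter is passed, so check whether it is present at all.
key = os.environ.get("OPENAI_API_KEY")
print("OPENAI_API_KEY is", "set" if key else "missing")
```

    If it prints "missing", the key was lost from the container's environment rather than rejected by the API, which would be consistent with a recreate fixing it.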

  • Gianfranco Campana

    This error keeps recurring, about once every 1 to 3 days.

    The only thing I can do is downgrade to a 7.x.x version, then upgrade again and recreate the 8.1.0 container (and, every time, remember to re-add the ingest folder to the whitelist).

    Quite annoying: the periodic downtime leads to missed data updates.
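
    For anyone hitting the same loop, the cycle I repeat is roughly this (a sketch — image name and tags are assumptions; adjust them to the compose file in use):

```shell
# roll back to the last known-good 7.x image
docker compose down
# edit docker-compose.yml to pin the previous release, e.g. tag v7.2.4
docker compose pull && docker compose up -d
# verify HeavyIQ responds, then switch the tag back to v8.1.0
docker compose pull && docker compose up -d --force-recreate
# afterwards, re-add the ingest folder to the whitelist
```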
