Connector Postgres



  • Candido Dessanti

    Hi @Ivan1981,

    you can connect to Postgres using HeavyConnect, but I guess the feature is EE-only (I'm not sure about that).

    You can find references to that feature here.

    Postgres is the first database added, but we are adding more (e.g., Snowflake, Hive, etc.).

    The feature should be in Beta.

    Best Regards, Candido

  • Иван Л

    Hey! I don't see a single one of the announced connections in the UI of version 6.4.0 :(

    Should I wait? And can I find out the priority directions for the platform's development?

  • Candido Dessanti


    You can use HeavyConnect through Heavy Immerse, but you have to install a driver for the database you are interested in (e.g., Postgres; Snowflake and Redshift have also been tested).

    Please refer to the HeavyConnect docs and pay special attention to the section on setting up HeavyConnect for Immerse.
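    For reference, a minimal sketch of what the unixODBC configuration might look like on Ubuntu after installing the odbc-postgresql package. The DSN name, host, and database below are hypothetical examples; the driver path is where Ubuntu's package usually installs the library, so check your system:

    # /etc/odbcinst.ini -- registers the driver
    [PostgreSQL Unicode]
    Description = PostgreSQL ODBC driver
    Driver      = /usr/lib/x86_64-linux-gnu/odbc/psqlodbcw.so

    # /etc/odbc.ini -- defines a DSN that uses the driver above
    [my_postgres_dsn]
    Driver     = PostgreSQL Unicode
    Servername = pg-host.example.com
    Port       = 5432
    Database   = mydb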

    After everything is correctly configured, you should see something like this in the Data Manager section of Immerse (I added just PostgreSQL).

    Then you can select PostgreSQL as a datasource, either to Connect (create a foreign table through the UI) or to Import (copy the data to a local table).

    Pressing Connect, you should get a preview, and you can choose the name of the foreign table you are going to create.

    More or less, the same happens when you choose the Import option.

    After creating the table, you can query or inspect it in the SQL Editor.

    Let me know if this answers your question or if you need any other info.

  • Иван Л

    odbcinst.ini and odbc.ini are in the /etc/ directory; that part is OK.
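    As a quick sanity check on those two files, here is a small Python sketch (the file contents and names in the example are hypothetical) that verifies every DSN in odbc.ini references a driver registered in odbcinst.ini, which is a common source of "driver not found" errors:

```python
# Sketch: cross-check unixODBC config files.
# A DSN in odbc.ini must name a driver section from odbcinst.ini
# (or give an absolute path to the driver library).
import configparser

def unregistered_dsns(odbc_ini_text: str, odbcinst_ini_text: str) -> list[str]:
    """Return DSNs whose Driver is neither a registered driver nor a path."""
    odbc = configparser.ConfigParser()
    odbc.read_string(odbc_ini_text)
    inst = configparser.ConfigParser()
    inst.read_string(odbcinst_ini_text)
    drivers = set(inst.sections())
    bad = []
    for dsn in odbc.sections():
        driver = odbc[dsn].get("Driver", "")
        if driver not in drivers and not driver.startswith("/"):
            bad.append(dsn)
    return bad

if __name__ == "__main__":
    # Hypothetical example: Postgres driver registered, Snowflake driver missing.
    odbcinst = "[PostgreSQL Unicode]\nDriver=/usr/lib/x86_64-linux-gnu/odbc/psqlodbcw.so\n"
    odbc = "[pg_dsn]\nDriver=PostgreSQL Unicode\n[sf_dsn]\nDriver=SnowflakeDSIIDriver\n"
    print(unregistered_dsns(odbc, odbcinst))  # only the Snowflake DSN is unresolved
```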

    Is the Dockerfile correct? I don't see postgresql in it.


    # Copy and extract HEAVY.AI tarball. In its own stage so that the temporary
    # tarball isn't included in a layer.
    FROM ubuntu:18.04 AS extract

    WORKDIR /opt/heavyai/
    COPY heavyai-latest-Linux-x86_64-cpu.tar.gz /opt/heavyai/
    RUN tar xvf heavyai-latest-Linux-x86_64-cpu.tar.gz --strip-components=1 && \
        rm -rf heavyai-latest-Linux-x86_64-cpu.tar.gz

    # Build final stage
    FROM ubuntu:18.04
    LABEL maintainer "HEAVY.AI Support <>"

    RUN apt-get update && apt-get install -y --no-install-recommends \
        libldap-2.4-2 \
        bsdmainutils \
        wget \
        curl \
        libgeos-dev \
        default-jre-headless && \
        apt-get remove --purge -y && \
        rm -rf /var/lib/apt/lists/*

    COPY --from=extract /opt/heavyai /opt/heavyai

    WORKDIR /opt/heavyai

    EXPOSE 6274 6273

    CMD /opt/heavyai/startheavy --non-interactive --data /var/lib/heavyai/storage --config /var/lib/heavyai/heavy.conf

    # Second Dockerfile: extends the EE CUDA image with the ODBC drivers.
    FROM heavyai/heavyai-ee-cuda

    # Install PostgreSQL ODBC driver.
    # unixodbc is also required by the Snowflake ODBC driver.
    RUN apt-get update && apt-get install -y odbc-postgresql unixodbc

    # Install Redshift ODBC driver.
    RUN dpkg -i ./AmazonRedshiftODBC-64-bit-
    RUN rm ./AmazonRedshiftODBC-64-bit-

    # Install Snowflake ODBC driver.
    RUN dpkg -i ./snowflake-odbc-2.25.2.x86_64.deb
    RUN rm ./snowflake-odbc-2.25.2.x86_64.deb
  • Иван Л

    I understand those instructions are for Ubuntu, but I have Docker on macOS.

    Can I get Dockerfile settings for macOS?

