RStudio IDE

How can we improve the RStudio IDE in IBM Data Scientist Workbench?

Enter your idea and we'll search to see if someone has already suggested it. If a similar idea already exists, you can support and comment on it; if it doesn't exist, you can post your idea so others can support it.

  1. Problem with the table function

    Error: class(objId) == "jobj" is not TRUE
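A possible cause (an assumption, since the failing call is not shown): once SparkR is attached it masks base R's `table()`, and SparkR's `table()` tries to invoke a Java method on its argument, so a plain R object fails the `class(objId) == "jobj"` check. Calling the base version explicitly avoids the masking:

```r
# Sketch, assuming the error comes from SparkR masking base::table().
# SparkR's table() expects a Spark SQL context / table name, not an R vector.
x <- c("a", "b", "a")
base::table(x)  # explicitly use base R's contingency-table function
```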

    1 vote  ·  2 comments
    • Problem uploading a CSV file: shared object 'readr.so' not found

      I am getting an error while uploading a CSV file; please help with this error.
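A missing `readr.so` usually points at a broken or stale `readr` installation rather than at the CSV file itself. A sketch of the usual fix (the file name below is hypothetical):

```r
# Reinstall readr so its shared object is rebuilt for this R version.
install.packages("readr")
library(readr)
df <- read_csv("myfile.csv")  # "myfile.csv" is a hypothetical file name
```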

      3 votes  ·  1 comment
      • Error installing RStan

        I am trying to install the "rstan" package from within RStudio. I got an error message saying "/usr/bin/ld: cannot find -lgfortran
        collect2: error: ld returned 1 exit status ..."

        This is probably because I do not have gfortran and its libraries installed. I know how to do this on my local Linux machine, but I don't know how to do it in the DSW cloud environment.

        Thanks.

        Best,
        Shige
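A possible workaround, assuming the DSW container is Debian/Ubuntu-based and permits system package installation (both assumptions):

```r
# Sketch: check for gfortran and install it if the environment allows it.
# Whether sudo/apt-get is available inside DSW is an assumption.
if (Sys.which("gfortran") == "") {
  system("sudo apt-get update && sudo apt-get install -y gfortran")
}
install.packages("rstan")  # retry once the Fortran toolchain is present
```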

        1 vote  ·  2 comments
        • Establishing a connection to dashDB using RODBC is not working

          When trying to set up a connection to dashDB from the RStudio IDE on DSX, I get this error:

          [RODBC] ERROR: state IM002, code 0, message [unixODBC][Driver Manager]Data source name not found, and no default driver specified

          This is after filling out this:

          library(RODBC)  # needed for odbcDriverConnect()
          dsn_driver <- "{IBM DB2 ODBC Driver}"
          dsn_database <- "BLUDB"            # e.g. "BLUDB"
          dsn_hostname <- "<Enter Hostname>" # e.g. "awh-yp-small03.services.dal.bluemix.net"
          dsn_port <- "50000"                # e.g. "50000"
          dsn_protocol <- "TCPIP"            # i.e. "TCPIP"
          dsn_uid <- "<Enter UserID>"        # e.g. "dash104434"
          dsn_pwd <- "<Enter Password>"      # e.g. "7dBZ39xN6$o0JiX!m"

          conn_path <- paste("DRIVER=", dsn_driver,
                             ";DATABASE=", dsn_database,
                             ";HOSTNAME=", dsn_hostname,
                             ";PORT=", dsn_port,
                             ";PROTOCOL=", dsn_protocol,
                             ";UID=", dsn_uid,
                             ";PWD=", dsn_pwd, sep = "")
          conn <- odbcDriverConnect(conn_path) …
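That unixODBC message usually means the name in `DRIVER={...}` is not registered with the driver manager, rather than a problem with the R code itself. A sketch of how to check, assuming the `odbcinst` CLI is available in the environment:

```r
# List the drivers unixODBC actually knows about; the DRIVER= value in the
# connection string must match one of these names exactly. If the DB2 CLI
# driver is not listed, it must first be installed and registered in
# odbcinst.ini (whether DSW ships it is an assumption).
system("odbcinst -q -d")
```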

          2 votes  ·  1 comment
          • Trying to use SparkR but createDataFrame(sqlContext, data) fails

            I get the following error:
            Error in invokeJava(isStatic = FALSE, objId$id, methodName, ...) :
            org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 0.0 failed 1 times, most recent failure: Lost task 0.0 in stage 0.0 (TID 0, localhost): java.lang.OutOfMemoryError: Java heap space
            at java.util.Arrays.copyOf(Arrays.java:3236)
            at java.io.ByteArrayOutputStream.grow(ByteArrayOutputStream.java:118)
            at java.io.ByteArrayOutputStream.ensureCapacity(ByteArrayOutputStream.java:93)
            at java.io.ByteArrayOutputStream.write(ByteArrayOutputStream.java:153)
            at java.io.ObjectOutputStream$BlockDataOutputStream.drain(ObjectOutputStream.java:1877)
            at java.io.ObjectOutputStream$BlockDataOutputStream.setBlockDataMode(ObjectOutputStream.java:1786)
            at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1189)
            at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:348)
            at org.apache.spark.serializer.JavaSerializationStream.writeObject(JavaSerializer.scala:44)
            at org.apache.spark.serializer.JavaSerializerInstance.serialize(JavaSerialize
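Since the failure is a driver-side `OutOfMemoryError` during serialization, one thing to try, sketched for the Spark 1.x API the post uses (the 4g figure is an assumption, and whether `spark.driver.memory` is honored at this point depends on how DSW launches the backend JVM):

```r
# Restart SparkR with a larger driver heap before calling createDataFrame().
library(SparkR)
sc <- sparkR.init(sparkEnvir = list(spark.driver.memory = "4g"))
sqlContext <- sparkRSQL.init(sc)
df <- createDataFrame(sqlContext, faithful)  # faithful: a small built-in demo dataset
```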

            1 vote  ·  0 comments
                • Don't see your idea?
