Frank Knot

Forum Replies Created

Viewing 11 posts - 346 through 356 (of 356 total)
  • in reply to: Environment setting for commandtimeout #23860
    Mark
    Keymaster

    Hi James,

F_JD21AE is raised when HVR tries to establish a connection to the database but does not succeed within the default (or specified) timeout period. The connection to the HVR installation that connects to DB2i (which may be an agent running on a different machine) succeeds, but the actual connection to the database does not.

F_JD22D3 is raised when HVR has submitted some SQL and expects a result, but does not receive it within the default (or specified) timeout period.

The environment variable HVR_TCP_KEEPALIVE changes the frequency of TCP keep-alive messages sent between HVR installations. This is useful when network timeouts terminate connections that are otherwise idle, for example when a large volume of data has been staged, HVR has submitted a command to bulk load it into the destination database, and HVR is simply waiting for the bulk load to report whether the data was loaded successfully. It does not appear that your environment has an issue like that.

SQL_PREPARE_TIMEOUT defines how much time HVR allows for a SQL statement to respond after submitting it. The unit is seconds, and the default is 60. For the second error HVR reports, setting this variable to a higher value may help, although one would expect your database to respond within 60 seconds.
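As a rough illustration (not from this thread), variables like these are typically set in the environment of the HVR process or agent before it starts; whether you use OS environment variables or HVR's Environment action depends on your setup, and the keep-alive value semantics should be checked against the HVR documentation:

```shell
# Hypothetical example: raise the SQL prepare timeout to 5 minutes
# before starting the HVR process/agent. Unit is seconds; default is 60.
export SQL_PREPARE_TIMEOUT=300

# Optionally adjust the TCP keep-alive frequency between HVR
# installations (verify the exact value semantics in the HVR docs).
export HVR_TCP_KEEPALIVE=60
```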

    HVR_ODBC_CONNECT_STRING_ADD is useful to provide extra instructions to the ODBC connection that may be driver-specific.
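As a sketch only, the variable holds extra keyword=value pairs appended to the ODBC connection string. The QUERYTIMEOUT keyword below is an assumption drawn from the discussion later in this thread; keyword names and legal values are driver-specific, so verify them in your ODBC driver's documentation before use:

```shell
# Hypothetical example: pass a driver-specific keyword through to the
# ODBC connection string. QUERYTIMEOUT=0 is assumed here to disable
# the driver's query timeout; check your driver's docs to confirm.
export HVR_ODBC_CONNECT_STRING_ADD="QUERYTIMEOUT=0"
```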

    Hope this clarifies.

    Mark.

    SMarella
    Participant

    Thanks Mark

    in reply to: Community Discussions #23946

    Hi Guys,
How can we retrieve clustered data from an SAP database into an Oracle database using SAPXForm?
Do we have to enable any permissions on the source, or do we have to configure anything in HVR?

    in reply to: How to retrieve clustered data from SAP #23947
    Mark
    Keymaster

For cluster tables HVR still performs log-based CDC, as it does for transparent tables. HVR picks up the encoded data in its binary format, and SAPXForm then, upon integrating the changes, decodes the data into the definition the SAP application uses. To enable this you must pick up the cluster table definitions from the SAP dictionaries during table explore.

A quick start for the definition of SAPXForm is in our documentation: https://www.hvr-software.com/docs/quick-start-guides/quick-start-for-hvr-sapxform

Please do note that SAPXForm is separately licensed and must be enabled in your license.

Hope this helps.

    Mark.

    in reply to: Community Discussions #24410

    Hi Team,
I want to load a CSV file into the HVR client GUI from my local machine. The HVR hub is installed on a separate server.
Can you please help me with the steps on how to proceed?

    Thanks
    Mohammad Younus

    in reply to: how to get my local files into hvr client gui? #24413
    Mark
    Keymaster

Hi Mohammad,

HVR provides data replication capabilities between data stores, typically databases (although data can be read from and written to CSV files as well).

    Some of our customers use HVR to move files between servers. More commonly data is passed between databases or between databases and file systems.

    Note the HVR GUI is used to configure the setup and monitor jobs. It does not provide the ability to inspect data or view query results.

You could use HVR to move the data on your PC to a server, or even into a database, by using file-to-file replication (see https://www.hvr-software.com/docs/quick-start-guides/quick-start-for-hvr-file-replication), or file-to-database replication if you have table definitions for your files. However, the latter use case may be a lot easier with an open source technology like Talend Open Studio.

    Hope this helps.

    Mark.

    in reply to: Community Discussions #24414

We plan to move our data from tables to Azure Blob file storage in flat files.
A few points we need confirmed:
• Does it need a separate license?
• Is it a must to set up a Hadoop environment?
• Do the output files on Azure Blob FS come in all sorts of flat file formats, or just CSV, JSON, and Avro?

    Thanks,
    Mohammad Younus

    in reply to: HVR FOR AZURE BLOB FS #24415
    Mark
    Keymaster

Hi Mohammad,

    The requirements to write to Azure Blob storage are here: https://www.hvr-software.com/docs/location-class-requirements/requirements-for-azure-blob-fs

    You can see there that a Hadoop client is required because HVR uses WebHDFS REST calls to deliver data into Blob Storage.

    Whether you need a separate license depends on the kind of license you have. Please contact your HVR account manager for this.

HVR supports the file formats Parquet, JSON, Avro, CSV, and XML (https://www.hvr-software.com/docs/actions/fileformat). If you want a different format, deliver in one of these and transform the data using an AgentPlugin.

    Hope this helps.

    Mark.

    in reply to: Community Discussions #24502

    Hi,

While replicating from Oracle to SQL DWH, the numeric data types are automatically converted into numeric with precision, and I don't need the precision.
The data type of the columns in Oracle is numeric, but when replicated to SQL DWH it comes out as numeric(38,4).
Is there a way to change the data type of all such columns, across all tables, from numeric(38,4) to just numeric without the precision?

    Thanks,
    Mohammad Yunus

    in reply to: How to change datatypes of specific columns for all tables #24504
    Mark
    Keymaster

Mohammad,

In HVR you can use the action ColumnProperties to modify the data types of columns. To do this for all columns with a specific data type, use ColumnProperties with the argument /DataTypeMatch, matching on number as the source data type, with options for scale/precision. Please see https://www.hvr-software.com/docs/actions/columnproperties#ColumnProperties-Parameters for the details. Then pick the target data type you want, with any scale/precision. Note that the DDL HVR generates must be able to run on the target, SQL DWH in your case.
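As a rough sketch only: defined on the target location group, such an action might look like the fragment below. The attribute-condition pattern inside /DataTypeMatch is an assumption on my part; verify the exact syntax against the ColumnProperties documentation linked above before using it.

```
ColumnProperties /DataTypeMatch="number[prec=38 && scale=4]" /DataType=numeric
```

The intent of the sketch is to match every column that would otherwise map to numeric(38,4) and emit it as plain numeric on the target.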

    Hope this helps.

    Mark.

    in reply to: Environment setting for commandtimeout #24640
    Dhaneshwari.Kumari
    Participant

    Hi Mark,

    I am also facing same issue “F_JD22D3 DBMS error [Communication link failure. comm rc=8413 – CWBCO1054 – A user-specified time-out occurred while sending or receiving data]. DBMS error number [8413]. SQL State [08S01]. DBMS version [*******] Last query [‘*******’]”.

I have tried setting SQL_PREPARE_TIMEOUT=30000 as an environment variable, which is a very high value in general, but no luck.

Can you help me understand what value to use for HVR_ODBC_CONNECT_STRING_ADD?
How can we add SQL_PREPARE_TIMEOUT to it?

What is the purpose of QUERYTIMEOUT? Does this also need to be added to HVR_ODBC_CONNECT_STRING_ADD? If yes, what is the recommended value for it?

    Regards
    Dhaneshwari
