Channel: New Topics on Talend Community Forum
Viewing all 2816 articles

[resolved] tREST with variable in body - by frederick.kirwin


Got it figured out! Here is how you do it. This is a good solution for people who find that the XML map or JSON schema is not generated per the API standard. [code]"{
   \"action\":\"createOnly\",
   \"lookupField\":\"email\",
   \"partitionName\":\"Swyft\",
   \"input\":"
+((String)globalMap.get("row3.Payload"))+"
}"[/code]


Difference between Standard jobs and Big Data jobs in Talend - by tui_gmp


Hi Folks, I am new to the Talend Big Data edition, so I am trying to understand the difference between standard jobs and Big Data jobs from an extraction, processing and loading perspective. Please confirm whether the understanding below is correct:

1. A standard job to extract 20 million records from one Impala table -> Cleanse -> Load into another Impala table:
Extraction -> happens by querying the Hadoop cluster (Impala query processing) into the Talend server.
Processing -> the cleansing of the 20 million rows happens on the Talend server.
Loading -> the processed records are then inserted into the Hadoop cluster (bulk inserts).

2. A Big Data job to extract 20 million records from one Impala table with a lookup -> Cleanse -> Load into another Impala table:
Extraction -> happens by querying the Hadoop cluster (Impala query processing) into the Talend server.
Processing -> the cleansing of the 20 million rows happens in a Hadoop MapReduce job and no data comes to the Talend server.
Loading -> the processed records are then inserted into the Hadoop cluster (bulk inserts).

3. The biggest difference between using processing components in a standard job and in a Big Data job is that the data comes to the Talend server for processing in a standard job and does not come to the Talend server in a Big Data job. Regards, Vas

Difference between Standard jobs and Big Data jobs in Talend - by tui_gmp


Ignore this thread, as a separate topic has been created with the image. Please respond to that one.

[resolved] Help on tOracleBulkExec-cannot run program "sqlldr" - by sophiez16


Hi Shong, I am using Talend 5.6.1. I got the following error when I use tOracleOutputBulkExec: [code]Exception in component tOracleOutputBulkExec_1_tOBE
java.io.IOException: Cannot run program "sqlldr": CreateProcess error=2, The system cannot find the file specified
at java.lang.ProcessBuilder.start(Unknown Source)
at java.lang.Runtime.exec(Unknown Source)...[/code] I have the Oracle 11g client installed on my Windows machine and the following environment variables are set: ORACLE_HOME: C:\Oracle\product\11.2.0\client_1 TNS_ADMIN: C:\Oracle\product\11.2.0\client_1\network\admin PATH contains: C:\Oracle\product\11.2.0\client_1\bin\ I have added a service name to C:\Oracle\product\11.2.0\client_1\network\admin\tnsnames.ora. Could you please help me figure out what has gone wrong? Thank you, Sophie

SSL connectivity to MQ Server from client - by Vladimirovich


Hi! Always concentrate your attention on the "Caused by" block: [code]Caused by: javax.net.ssl.SSLHandshakeException: No appropriate protocol (protocol is disabled or cipher suites are inappropriate)[/code] See http://stackoverflow.com/questions/28236091/how-to-enable-ssl-3-in-java Also check the general Java settings: Control Panel -> Java. Good luck.

tTeradataTPTUtility - Anyone know how to make it work? - by amit2676


[quote from talendtester]I am trying to get tTeradataTPTUtility to work. I have a simple comma-delimited file with 12 rows. My job looks like tFileInputDelimited -> tTeradataTPTUtility. The TalendTesterror.log says "'tbuild' is not recognized as an internal or external command, operable program or batch file". I installed the drivers for this: Product: Teradata Parallel Transporter Load Operator for Windows, Version: 13.10.00.04. But Talend still doesn't find tbuild; what else can I try?[/quote]
Dear talendtester, I am also trying tFileInputDelimited -> tTeradataTPTUtility. Can you please let me know, step by step or with screenshots, how you achieved this? Help will be highly appreciated. Thanks... Amit

How to Close a Job in an infinite loop? - by Vladimirovich


Hi, you already got the answer in the previous post:

[quote from ]In the "Run if" condition, use the global variable tRunJob_1_CHILD_RETURN_CODE to test the return code: 0 means it succeeded, 1 means it failed.[/quote]
Try this process: tLoop (with tRunJob_1_CHILD_RETURN_CODE as the condition) -> tRunJob_1 (the "Die on error" flag should be disabled). I have been using this working scenario for a few years. Good luck.
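The retry condition can be sketched outside Talend as a minimal standalone Java class. This is only an illustration under stated assumptions: the `globalMap` field here is an ordinary HashMap standing in for Talend's globalMap, and `RetryLoopSketch`/`shouldRetry` are made-up names, not Talend APIs.

```java
import java.util.HashMap;
import java.util.Map;

public class RetryLoopSketch {
    // Hypothetical stand-in for Talend's globalMap.
    static Map<String, Object> globalMap = new HashMap<>();

    // The kind of expression you would put in the tLoop condition:
    // keep looping while the child job's last return code is non-zero.
    static boolean shouldRetry() {
        Integer rc = (Integer) globalMap.get("tRunJob_1_CHILD_RETURN_CODE");
        return rc == null || rc != 0; // null before the child has run once
    }

    public static void main(String[] args) {
        System.out.println(shouldRetry()); // true: child has not run yet
        globalMap.put("tRunJob_1_CHILD_RETURN_CODE", 1); // child failed
        System.out.println(shouldRetry()); // true: retry
        globalMap.put("tRunJob_1_CHILD_RETURN_CODE", 0); // child succeeded
        System.out.println(shouldRetry()); // false: stop looping
    }
}
```

Disabling "Die on error" on the tRunJob matters here: otherwise a failing child kills the parent before the loop can test the return code.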

tWriteXMLField : How to use it ? - by benji


Hello, I encounter exactly the same issue as rsouris. Is there any workaround you could propose? Could you please help? Thanks


Data from header into column - by djdeejay_offline


Hello mdelaj, assume your row1 from tFileInputDelimited_1 has the schema: Name, City, Phone_number, Country. If you switch off the header (set it to 0) in your tFileInputDelimited_1, then you should get your date as the first row, in row1.Name. Place a tJavaRow between tFileInputDelimited_1 and tMap_1, open the Component view and click "Generate code"; you will get four lines of code like output.Name = input.Name and so on. Add these lines at the end: [code]if (Relational.ISNULL(input.City) && Relational.ISNULL(input.Phone_number) && Relational.ISNULL(input.Country))
    globalMap.put("myDate", (Date)input.Name); // (you might need to convert first if it is not really a date)[/code] In your tMap, add the date column to your output row. As the expression for this column, enter: (Date)globalMap.get("myDate") HTH DJ
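The null-check trick above can be sketched as plain Java. The class, the `myDate` field, and `captureHeaderDate` are hypothetical names invented for this illustration; they stand in for the tJavaRow code and globalMap, and the date is kept as a String here instead of converting it.

```java
public class HeaderDateSketch {
    static String myDate; // stands in for globalMap.put("myDate", ...)

    // Mirrors the tJavaRow logic from the post: when every column except
    // Name is null, the row is really the file's header line carrying the date.
    static boolean captureHeaderDate(String name, String city,
                                     String phone, String country) {
        if (city == null && phone == null && country == null) {
            myDate = name; // in a real job, convert to Date before storing
            return true;   // caller can drop this pseudo-row
        }
        return false;
    }

    public static void main(String[] args) {
        System.out.println(captureHeaderDate("2016-03-01", null, null, null));
        System.out.println(captureHeaderDate("Alice", "Paris", "12345", "FR"));
    }
}
```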

tteradatarow component - by jlolling


The component expects the [code]tbuild[/code] program to be in the PATH. The current version also does not die if something is wrong with the generated script. Please do the following: connect the tTeradataTPTUtility with a "Run if" trigger to a tDie component.

Give the tDie a meaningful message like (assuming the component's unique name is tTeradataTPTUtility_1; otherwise adjust the number): [code]"Load failed with exitcode: " + ((Integer)globalMap.get("tTeradataTPTUtility_1_EXIT_VALUE"))[/code] This way you will see whether the process itself works correctly. If it does, you should check the log files or log tables.
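The post gives the tDie message but not the "Run if" condition itself; a sketch of both is below, assuming a non-zero EXIT_VALUE means the load failed. The `globalMap` field and both method names are stand-ins made up for illustration.

```java
import java.util.HashMap;
import java.util.Map;

public class ExitValueCheckSketch {
    // Hypothetical stand-in for Talend's globalMap.
    static Map<String, Object> globalMap = new HashMap<>();

    // What the "Run if" condition would test: fire tDie only when
    // tbuild exited non-zero (assumption: non-zero = failure).
    static boolean loadFailed() {
        Integer exit = (Integer) globalMap.get("tTeradataTPTUtility_1_EXIT_VALUE");
        return exit != null && exit != 0;
    }

    // The tDie message from the post, built the same way.
    static String dieMessage() {
        return "Load failed with exitcode: "
            + ((Integer) globalMap.get("tTeradataTPTUtility_1_EXIT_VALUE"));
    }
}
```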

Error due to older Groovy version - by dbh


This topic lives on at https://jira.talendforge.org/browse/TDI-33821 There are incompatibilities between the ancient Groovy 1.0 (used in TOS DI 6.1.x) and Java 1.8. If interested, please see the JIRA issue, which has my attached changes to make tGroovy compatible with Java 1.8 and to upgrade Groovy itself. I believe the right approach to using this is as follows:
* make a custom component out of the modified tGroovy
* switch jobs to use the revised version
* wait for Talend to integrate the fix and the Groovy version upgrade
* once that fix is included in a Talend release, switch jobs back over to the official Talend component

Problem with Expression and Excel file input - by dpringenbach


I am new to Talend, so thanks for the help. I was able to stack them and have multiple checks. But there are many different values in the column for character length ($20, $200, $3, etc.), whatever the length of the data in that row. I want to take the first character of the Object, compare it to "$", and then write the new/different character format to the output. I'm looking at substring, but I think I have the same issue of comparing a String to an Object. Any direction I should take?
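One way around comparing a String to an Object is to convert the cell to a String first and only then look at the first character. A minimal sketch of that idea follows; the class and method names are made up for illustration, not part of any Talend API.

```java
public class DollarPrefixSketch {
    // Excel cells often arrive typed as Object; convert safely before comparing.
    static boolean startsWithDollar(Object cell) {
        if (cell == null) return false;
        String s = String.valueOf(cell);
        return !s.isEmpty() && s.charAt(0) == '$';
    }

    // Strip the "$" so the remainder can be reformatted or parsed.
    static String stripDollar(Object cell) {
        String s = String.valueOf(cell);
        return startsWithDollar(cell) ? s.substring(1) : s;
    }

    public static void main(String[] args) {
        System.out.println(startsWithDollar("$200")); // true
        System.out.println(stripDollar("$200"));      // 200
        System.out.println(stripDollar("plain"));     // plain
    }
}
```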

Get MAX(ID) from tMysqlInput - job error - by seeusoon


Hi djdeejay_offline, Thank you for your reply. I added the backticks but it didn't change anything.

Create Array of JSON Objects via tRESTCLient for Marketo Custom Object - by frederick.kirwin


I just went through this exact same issue when trying to create leads. I found a great solution, and one I will probably reuse quite a bit. Please excuse me if I can't remember all of the details, as I don't have my work computer; also, some of these steps may not be needed for you. Let me know if you need more help; I can take screenshots when I am at my work computer. Here is what I did:

tFileList (to loop through the directory) --> flat file with new leads --> tBufferInput --> tBufferOutput --> tMap --> tWriteJSONField --> tIterateToFlow --> tREST

Reasoning: the flat file contains the data, which is picked up by the tFileList. Use the buffers in case there is only one record in the set. That may not be a big deal for you, but our leads are being submitted without any standards and may come in batches of 1-300; this component is optional if you aren't worried about that. Use the tMap to map from the source schema to the tWriteJSONField schema. The tWriteJSONField component should have one output, the payload. Configure the JSON tree to accept the fields as sub-fields: map the rows under the root element, which can have any arbitrary name, and make what could be considered a primary key (in my case the email address) the loop element. tIterateToFlow will scoop up the payload with a key. For tREST, pass your authentication credentials in the header. In the body, use this (for create lead, anyway): [code]"{
   \"action\":\"createOnly\",
   \"lookupField\":\"email\",
   \"partitionName\":\"Swyft\",
   \"input\":"
+((String)globalMap.get("row3.Payload"))+"
}"[/code]
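The body concatenation above can be sketched as plain Java. `MarketoBodySketch` and `buildBody` are hypothetical names, and the `payload` argument stands in for the tWriteJSONField output that the real job fetches from globalMap.

```java
public class MarketoBodySketch {
    // Splices a pre-built JSON array (the payload) into the request envelope.
    static String buildBody(String payload) {
        return "{"
            + "\"action\":\"createOnly\","
            + "\"lookupField\":\"email\","
            + "\"partitionName\":\"Swyft\","
            + "\"input\":" + payload
            + "}";
    }

    public static void main(String[] args) {
        // A sample payload such as tWriteJSONField might produce:
        System.out.println(buildBody("[{\"email\":\"jane@example.com\"}]"));
    }
}
```

The design choice here is pragmatic rather than pretty: instead of making the REST component's XML/JSON mapping emit the exact shape the API wants, the payload is built as a raw string and dropped into the envelope verbatim.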

[Talend ESB] POST method not working for a web service - by junmilsso


Have you tried changing the Content and Accept types? Try swapping XML for the other options in their drop-down boxes and see whether a combination works.


put into amazon s3 repeatedly - by dteng


Hi there, I hope someone can give me some pointers. I have a file directory containing thousands of files, and a CSV file that contains the names of the files that I need to upload to S3. This CSV file changes on a regular basis. I tried to read the CSV file using tFileInputDelimited and use tFlowToIterate, where I specify a "fileName" key with the value pointing to the cell in the CSV file, and then iterate to tS3Put: tFileInputDelimited --(Main)-- tFlowToIterate --(Iterate)-- tS3Put. In tS3Put's File field, I enter the file location path and ((String)globalMap.get("fileName")). When I run it, I get null (repeatedly), as if the key is not getting the value. Did I set up the job design and components correctly? What mistake did I make? Thanks for your help in advance!
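One thing worth checking (an assumption to verify, not a confirmed diagnosis): with its default settings, tFlowToIterate stores values under keys of the form rowName.columnName, so a lookup under the bare column name comes back null. A minimal standalone sketch, with an ordinary HashMap standing in for globalMap:

```java
import java.util.HashMap;
import java.util.Map;

public class IterateKeySketch {
    // Hypothetical stand-in for Talend's globalMap.
    static Map<String, Object> globalMap = new HashMap<>();

    public static void main(String[] args) {
        // With "Use the default (key, value)" checked, tFlowToIterate
        // stores each value under "<rowName>.<columnName>", e.g.:
        globalMap.put("row1.fileName", "C:/files/input1.csv");

        // Looking up the bare column name then yields null:
        System.out.println(globalMap.get("fileName"));      // null
        // The prefixed key returns the value:
        System.out.println(globalMap.get("row1.fileName")); // C:/files/input1.csv
    }
}
```

If a custom key named "fileName" really was defined in the tFlowToIterate table, the bare lookup should work, so the next thing to compare is the exact key spelling in the component against the one used in tS3Put.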

Connect Talend to Hive in Hortonworks Sandbox - by Jegan


Hi, even I am getting the same issue. My Hive is running under the Tez engine framework, and when I run the job I get the issue below; is there any solution for this? [code]Exception in component tHiveConnection_1
org.apache.hive.service.cli.HiveSQLException: Error while processing statement: Cannot modify mapreduce.framework.name at runtime. It is not in list of params that are allowed to be modified at runtime
at org.apache.hive.jdbc.Utils.verifySuccess(Utils.java:231)
at org.apache.hive.jdbc.Utils.verifySuccessWithInfo(Utils.java:217)[/code] Your help will be really appreciated.

convert the unix timestamp - by Shri_Kul1


Hi All, one general question: what should we plan for on the data warehouse side when we need to delete a bunch of records? A backup is one of them.
- What are the best practices?
- What operations should be run on the DWH before starting the delete?

Tmap null pointer exception - by bipinkumarcse


[quote from jlolling]Simply check which input column is nullable and is assigned to a non-nullable output column. These are potential assignments causing a NullPointerException.[/quote]
All my input columns and all output columns are nullable. I have used MD5 to create a hash value, using multiple columns to generate it. Can this cause the exception? How do I find the root cause of this exception, i.e. which columns or what data is leading to it, so that I can handle that value?

Not able to download the zipped folder from exchange site - by shruti04mittal


Hi, I am trying to download the component tRedirectOutput, but I am not getting any zipped file; instead I am getting a file like the one below. Please help me figure out what might be wrong here.
