[resolved] tREST with variable in body - by frederick.kirwin
Got it figured out! Here is how you do it. This is a good solution for people who find that the XML map or JSON schema is not generating per the API standard.

[code]"{
    \"action\":\"createOnly\",
    \"lookupField\":\"email\",
    \"partitionName\":\"Swyft\",
    \"input\":"
+ ((String)globalMap.get("row3.Payload")) + "
}"[/code]
Difference between Standard jobs and Big Data jobs in Talend - by tui_gmp
Hi Folks,
I am new to the Talend Big Data edition, so I am trying to understand the difference between standard jobs and Big Data jobs from an Extraction, Processing, and Loading perspective. Please confirm whether the understanding below is correct:
1. A standard job to extract 20 million records from one Impala table -> Cleanse -> Load into another Impala table:
Extraction -> Happens by querying the Hadoop cluster (Impala query processing) into the Talend server
Processing -> The cleansing of the 20 million rows happens on the Talend server
Loading -> The processed records are then inserted into the Hadoop cluster (bulk inserts)
2. A Big Data job to extract 20 million records from one Impala table with a lookup -> Cleanse -> Load into another Impala table:
Extraction -> Happens by querying the Hadoop cluster (Impala query processing) into the Talend server
Processing -> The cleansing of the 20 million rows happens in a Hadoop MapReduce job, and no data comes to the Talend server
Loading -> The processed records are then inserted into the Hadoop cluster (bulk inserts)
3. The biggest difference between using processing components in a standard job and in a Big Data job is that in a standard job the data comes to the Talend server for processing, while in a Big Data job it does not.
Regards,
Vas
Difference between Standard jobs and Big Data jobs in Talend - by tui_gmp
Ignore this thread, as a separate topic has been created with the image. Please respond to that one.
[resolved] Help on tOracleBulkExec-cannot run program "sqlldr" - by sophiez16
Hi Shong, I am using Talend 5.6.1. I got the following error when I use tOracleOutputBulkExec:

[code]Exception in component tOracleOutputBulkExec_1_tOBE
java.io.IOException: Cannot run program "sqlldr": CreateProcess error=2, The system cannot find the file specified
    at java.lang.ProcessBuilder.start(Unknown Source)
    at java.lang.Runtime.exec(Unknown Source)...[/code]

I have the Oracle 11g client installed on my Windows machine, and the following environment variables are set:

ORACLE_HOME: C:\Oracle\product\11.2.0\client_1
TNS_ADMIN: C:\Oracle\product\11.2.0\client_1\network\admin
PATH contains: C:\Oracle\product\11.2.0\client_1\bin\

I have also added a service name to C:\Oracle\product\11.2.0\client_1\network\admin\tnsnames.ora. Could you please help me figure out what has gone wrong? Thank you, Sophie
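For what it's worth, that IOException means the JVM running the job cannot resolve sqlldr from the PATH it inherited, regardless of what a shell sees. A minimal standalone diagnostic sketch (hypothetical class, not a Talend component) to check which directories the JVM actually searches:

[code]// FindSqlldr.java - prints whether sqlldr is visible on the PATH seen by this JVM
import java.io.File;

public class FindSqlldr {
    public static void main(String[] args) {
        String exe = System.getProperty("os.name").toLowerCase().contains("win")
                ? "sqlldr.exe" : "sqlldr";
        boolean found = false;
        for (String dir : System.getenv("PATH").split(File.pathSeparator)) {
            File candidate = new File(dir, exe);
            if (candidate.isFile()) {
                System.out.println("Found: " + candidate.getAbsolutePath());
                found = true;
            }
        }
        if (!found) {
            System.out.println("sqlldr not found on the PATH seen by this JVM");
        }
    }
}[/code]

If nothing is found, restarting Talend Studio after editing PATH (or launching it from a shell where sqlldr already runs) is often the missing step, since Studio only sees the environment it was started with.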
SSL connectivity to MQ Server from client - by Vladimirovich
Hi! Always concentrate your attention on the "Caused by" block:

[code]Caused by: javax.net.ssl.SSLHandshakeException: No appropriate protocol (protocol is disabled or cipher suites are inappropriate)[/code]

See http://stackoverflow.com/questions/28236091/how-to-enable-ssl-3-in-java

Also check the general Java settings: Control Panel -> Java. Good luck.
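Before changing anything, it can help to see what the local JVM enables by default and compare that against what the MQ server requires. A minimal sketch (standalone diagnostic, hypothetical class name):

[code]// ListTlsDefaults.java - prints the protocols and cipher suites this JVM enables by default
import javax.net.ssl.SSLContext;
import javax.net.ssl.SSLParameters;

public class ListTlsDefaults {
    public static void main(String[] args) throws Exception {
        SSLParameters params = SSLContext.getDefault().getDefaultSSLParameters();
        System.out.println("Protocols: " + String.join(", ", params.getProtocols()));
        System.out.println("Cipher suites: " + params.getCipherSuites().length + " enabled");
    }
}[/code]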
tTeradataTPTUtility - Anyone know how to make it work? - by amit2676
[quote from talendtester]I am trying to get tTeradataTPTUtility to work. I have a simple comma-delimited file with 12 rows. My job looks like tFileInputDelimited > tTeradataTPTUtility. The error log says "'tbuild' is not recognized as an internal or external command, operable program or batch file". I installed the drivers for this: Product: Teradata Parallel Transporter Load Operator for Windows, Version: 13.10.00.04. But Talend still doesn't find tbuild; what else can I try?[/quote]
Dear, I am also trying tFileInputDelimited > tTeradataTPTUtility. Can you please let me know, step by step or with screenshots, how you achieved it? Help will be highly appreciated. Thanks... Amit
How to Close a Job in an infinite loop? - by Vladimirovich
Hi, you have already got the answer in a previous post:
[quote]In the Run if condition, use the global variable tRunJob_1_CHILD_RETURN_CODE to test the return code: 0 means OK, 1 means failed.[/quote]
Try this process: tLoop (with a tRunJob_1_CHILD_RETURN_CODE condition) -> tRunJob_1 (the Die on error flag should be disabled). I have been using this working scenario for a few years. Good luck.
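A hedged sketch of that Run if condition as a Talend Java expression (the _1 in the variable name must match the tRunJob's unique name):

[code]// Run if condition: true when the child job returned a non-zero (failed) code
((Integer)globalMap.get("tRunJob_1_CHILD_RETURN_CODE")) != 0[/code]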
tWriteXMLField : How to use it ? - by benji
Hello, I am encountering exactly the same issue as rsouris. Is there any workaround you could propose? Could you please help? Thanks
Data from header into column - by djdeejay_offline
Hello mdelaj, assume your row1 from tFileInputDelimited_1 has the schema: Name, City, Phone_number, Country. If you switch off the header (set it to 0) in tFileInputDelimited_1, you should then get your date as the first row, in row1.Name.

Place a tJavaRow between tFileInputDelimited_1 and tMap_1, open the Component view, and click "Generate code"; you will get four lines of code, output_row.Name = input_row.Name and so on. Add these lines at the end:

[code]if (Relational.ISNULL(input_row.City) && Relational.ISNULL(input_row.Phone_number)
        && Relational.ISNULL(input_row.Country))
    globalMap.put("myDate", (Date)input_row.Name); // you might need to convert here first if Name is not really a Date[/code]

In your tMap, add the date column to your output row; as the expression for this column, enter:

[code](Date)globalMap.get("myDate")[/code]

HTH, DJ
tteradatarow component - by jlolling
The component expects the [code]tbuild[/code] program on the PATH. The current version also does not die if something is wrong with the generated script.
Please do the following:
Connect the tTeradataTPTUtility to a tDie component with a Run if trigger.
Give the tDie a meaningful message, e.g. (assuming the component's unique name is tTeradataTPTUtility_1; otherwise adjust the number):
[code]"Load failed with exit code: " + ((Integer)globalMap.get("tTeradataTPTUtility_1_EXIT_VALUE"))[/code]
This way you will see whether the process itself works correctly. If it does, you should check the log files or log tables.
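For completeness, a hedged sketch of the Run if condition that would pair with that tDie (assuming an exit value of 0 means success, which is the usual tbuild convention):

[code]// Run if condition (Java boolean expression) between tTeradataTPTUtility_1 and the tDie
((Integer)globalMap.get("tTeradataTPTUtility_1_EXIT_VALUE")) != 0[/code]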
Error due to older Groovy version - by dbh
This topic lives on at https://jira.talendforge.org/browse/TDI-33821. There are incompatibilities between the ancient Groovy 1.0 (used in TOS DI 6.1.x) and Java 1.8. If interested, please see the JIRA, which has my attached changes to tGroovy to make it compatible with Java 1.8 and to upgrade Groovy itself. I believe the right approach to using this is as follows:
* Make a custom component out of the modified tGroovy
* Switch jobs to use the revised version
* Talend integrates the fix and the Groovy version upgrade
* Once that fix is included in a Talend release, switch jobs back to the official Talend component
Problem with Expression and Excel file input - by dpringenbach
I am new to Talend, so thanks for the help. I was able to stack them and have multiple checks. But there are many different values in the column for character length ($20, $200, $3, etc.), whatever the length of the data in that row. I want to take the first character of the Object, compare it to "$", and then write the new/different character format to the output. I'm looking at substring, but I think I have the same issue of comparing a String to an Object. Any direction I should take?
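One possible direction, as a sketch only (the column name amount is hypothetical, and this assumes a tJavaRow-style context): convert the Object to a String first, which sidesteps the Object-vs-String comparison for both charAt and substring:

[code]// Hedged sketch: null-safe Object-to-String conversion, then first-character test
Object raw = input_row.amount;                 // column typed as Object (hypothetical name)
String text = (raw == null) ? "" : String.valueOf(raw);
boolean startsWithDollar = !text.isEmpty() && text.charAt(0) == '$';
// the same check via substring:
// boolean viaSubstring = !text.isEmpty() && "$".equals(text.substring(0, 1));[/code]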
Get MAX(ID) from tMysqlInput - job error - by seeusoon
Hi djdeejay_offline,
Thank you for your reply.
I added the backticks but it didn't change anything.
Create Array of JSON Objects via tRESTClient for Marketo Custom Object - by frederick.kirwin
I just went through this exact issue when trying to create leads. I found a great solution, though, and one I will probably reuse quite a bit. Please excuse me if I can't remember all of the details, as I don't have my work computer; also, some of these steps may not be needed for you. Let me know if you need more help. I can take screenshots when I am at my work computer.

Here is what I did:

tFileList to loop through the directory --> flat file with new leads --> tBufferInput --> tBufferOutput --> tMap --> write to JSON --> tIterateToFlow --> tREST

Reasoning: the flat file contains the data, which is picked up by the file list. Use the buffer in case there is only one record in the set. That may not be a big deal for you, but our leads are being accessed without any standards and may come in batches of 1-300, so this component is optional if you aren't scared of that phenomenon.

Use the tMap to map from the source schema to the write-to-JSON schema. The write-to-JSON component should have one output, the payload. Use "Configure JSON tree" to accept the fields as sub-fields. Map the rows under the root element, which can be given any arbitrary name. Make what could be considered a primary key (in my case the email address) the loop element.

tIterateToFlow will scoop up the payload with a key.

For tREST, pass your authentication credentials through in the header. In the body, use this (for create lead, anyway):

[code]"{
    \"action\":\"createOnly\",
    \"lookupField\":\"email\",
    \"partitionName\":\"Swyft\",
    \"input\":"
+ ((String)globalMap.get("row3.Payload")) + "
}"[/code]
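For illustration only (the lead fields here are hypothetical), after the globalMap substitution the body sent in the tREST call would look roughly like:

[code]{
    "action": "createOnly",
    "lookupField": "email",
    "partitionName": "Swyft",
    "input": [{"email": "jane@example.com", "firstName": "Jane"}]
}[/code]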
[Talend ESB] POST Method not working for a web service - by junmilsso
Have you tried changing the Content and Accept types? Try swapping XML for the other options in their drop-down boxes and see if a combination works.
put into amazon s3 repeatedly - by dteng
Hi there, I hope someone can give me a pointer. I have a file directory containing thousands of files, and a CSV file that contains the names of the files I need to upload to S3. This CSV file changes on a regular basis. I tried to read the CSV file using tFileInputDelimited, use tFlowToIterate where I specify a "fileName" key with the value pointing to the cell in the CSV file, and then iterate to tS3Put:

tFileInputDelimited --(Connect)--> tFlowToIterate --(Iterate)--> tS3Put

In tS3Put's File field, I enter the file location path and ((String)globalMap.get("fileName")). When I run it, I get a null (repeatedly), as if the key is not getting the value. Did I set up the job design and components correctly? What mistake did I make? Thanks for your help in advance!
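One hedged guess at the null: by default tFlowToIterate stores each column under a key named rowName.columnName, so unless a custom key was defined, the lookup would need to be:

[code]// default key layout (assuming the flow into tFlowToIterate is named row1
// and the column holding the file name is fileName):
((String)globalMap.get("row1.fileName"))
// ((String)globalMap.get("fileName")) only resolves if you added a custom
// key "fileName" in the tFlowToIterate component settings[/code]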
Connect Talend to Hive in Hortonworks Sandbox - by Jegan
Hi, I am getting the same issue. My Hive is running under the Tez engine framework, and when I run the job I get the error below. Any solution for this?

[code]Exception in component tHiveConnection_1
org.apache.hive.service.cli.HiveSQLException: Error while processing statement: Cannot modify mapreduce.framework.name at runtime. It is not in list of params that are allowed to be modified at runtime
    at org.apache.hive.jdbc.Utils.verifySuccess(Utils.java:231)
    at org.apache.hive.jdbc.Utils.verifySuccessWithInfo(Utils.java:217)[/code]

Your help will be really appreciated.
convert the unix timestamp - by Shri_Kul1
Hi all, one general question: what should we plan for on the data warehouse side when we need to delete a bunch of records? A backup is one thing. What are the best practices? What operations should be performed on the DWH before starting the delete?
Tmap null pointer exception - by bipinkumarcse
[quote from jlolling]Simply check which input column is nullable and is assigned to a non-nullable output column. These are potential assignments causing a NullPointerException.[/quote]
All my input columns and all output columns are nullable. I have used MD5 to create a hash value, using multiple columns to generate it. Can this cause the exception? How do I find the root cause of this exception, i.e. which columns or what data is leading to it, so that I can handle that value?
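One common root cause in hash expressions, sketched under stated assumptions (hypothetical String columns col1..col3; commons-codec on the classpath, which Talend normally ships): a method call such as .toString() or .trim() on a null column throws the NullPointerException, and even plain concatenation silently injects the text "null" into the hash. Guarding each input avoids both problems:

[code]// Null-safe hash input: an empty string stands in for a null column value
String joined = (input_row.col1 == null ? "" : input_row.col1)
              + (input_row.col2 == null ? "" : input_row.col2)
              + (input_row.col3 == null ? "" : input_row.col3);
String hash = org.apache.commons.codec.digest.DigestUtils.md5Hex(joined);[/code]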
Not able to download the zipped folder from exchange site - by shruti04mittal
Hi,
I am trying to download the component tRedirectOutput, but I am not getting a zipped file; instead I am getting a file like the one below:
Please help me figure out what might be wrong here.