You have to be careful running external processes: they are spawned from the Data Pump 7 service, which is a .NET service and, as such, does not provide a UI to applications.
In the script test environment, you can see the UI, as it is an interactive app.
The other issue is that the process must return (i.e. exit), although you can set a timeout value, in which case Data Pump will attempt to kill the process and continue the job.
So, there could be two things happening here:

1. You have a command-line process with the secure FTP client that does the job and then exits. This is what you want and should work fine, but the app may be taking a very long time to upload, or is for some reason not shutting down and signalling MDP to continue with the job.

2. The command line you are using for the FTP client does not shut it down after it performs its task. In this case, MDP either waits forever, as the process never returns, or is cut off by a timeout you have set in MDP, which produces a failure.
Let me know which secure FTP client it is; if it is free, we could set it up here and try to get it working.
PS: In any event, you will never actually see the app appear on the machine, since no UI is allowed.
[quote]I am working on running an external process in a script. This process uses a secure copy client to transfer a file. When I test the script using the test option, it works fine. When I go to run my process in DataPump, the process fails. Does anybody have an idea of why this would be?[/quote]
Unfortunately we do not have an SSH server to test on. Could you e-mail me at email@example.com and perhaps set up a test account for us to use for testing? Include your phone number so I can give you a call if necessary.
[quote]I am using PSCP (PuTTY Secure Copy) to do file transfers.[/quote]
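For what it's worth, PSCP can be run fully non-interactively so the process always exits on its own, which is what the service needs. A sketch of such a command line (the host, user, password, and paths here are placeholders, not from this thread):

```bat
rem Run PSCP without interactive prompts so it always returns.
rem -batch : never stop to ask a question (the transfer fails instead of hanging)
rem -pw    : supply the password on the command line
pscp -batch -pw secret C:\exports\data.csv user@sftp.example.com:/incoming/data.csv
```

Without -batch, PSCP may pause to ask about an unknown host key or prompt for a password, and since the service allows no UI, the process would then appear to hang until the MDP timeout kills it.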
I had an issue just like that: I could test the script and it would work fine. The issue for me was that when you let DataPump run the job, it runs with DataPump's rights. For me, when I RDP onto the server there is a G: drive, but when DataPump runs the job it doesn't see that G: drive. It's something to check out.
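On the drive-letter point: mapped drives like G: belong to the user session that mapped them, so a job running under the service account usually won't see them. Referring to the share by its UNC path avoids that dependency. A sketch, assuming \\fileserver\share is the share behind G: (the names are made up for illustration):

```bat
rem Mapped-drive path: only works in sessions where G: is mapped
copy G:\exports\data.csv D:\work\

rem UNC path: works regardless of per-session drive mappings
copy \\fileserver\share\exports\data.csv D:\work\
```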
Now on to my question. I now have a batch file that I run in the post-export of a project file. I want it to produce a log file, but the standard coding methods aren't working, e.g. '[batchName].bat >> d:\test.log' and the like. If DataPump can log what the batch file is doing, that would be great. Otherwise, how do I get it to create one? Here is the code I use so far in the script area:
<postexport>
If Shell("""D:\batchFiles\ACH_BATCH.bat""", AppWinStyle.Hide, True, 90000) <> 0 Then
    ' TODO: Handle command timeout here.
End If
</postexport>
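One possible explanation for the redirection being ignored (this is an assumption about how VB's Shell function behaves, not something confirmed in this thread) is that Shell launches the program directly rather than through a command interpreter, so operators like >> are passed to the batch file as plain arguments instead of being processed. Routing the call through cmd.exe /c lets cmd handle the redirection; the batch-file and log paths below are the ones from the post above:

```vb
' Sketch: let cmd.exe interpret the >> redirection.
' 2>&1 also captures error output into the same log file.
If Shell("cmd.exe /c ""D:\batchFiles\ACH_BATCH.bat"" >> D:\test.log 2>&1", _
         AppWinStyle.Hide, True, 90000) <> 0 Then
    ' TODO: Handle command timeout here.
End If
```

An alternative that keeps the Shell call unchanged is to add the `>> D:\test.log 2>&1` redirection to each command inside ACH_BATCH.bat itself.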