Identifying datapump jobs
TTITLE 'Currently Active DataPump Operations'
COL owner_name FORMAT A06 HEADING 'Owner'
COL job_name FORMAT A20 HEADING 'JobName'
COL operation FORMAT A12 HEADING 'Operation'
COL job_mode FORMAT A12 HEADING 'JobMode'
COL state FORMAT A12 HEADING 'State'
COL degree FORMAT 9999 HEADING 'Degr|(Num|Wrkers)'
COL attached_sessions FORMAT 9999 HEADING 'Sess'
SELECT
owner_name
,job_name
,operation
,job_mode
,state
,degree
,attached_sessions
FROM dba_datapump_jobs;
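The view above only lists the jobs themselves. To see which database sessions belong to a job (for example before deciding to kill one), dba_datapump_sessions can be joined to v$session on the session address. A minimal sketch, with column formatting chosen just to match the report above:
COL sid FORMAT 99999 HEADING 'SID'
COL serial# FORMAT 999999 HEADING 'Serial#'
SELECT
dps.owner_name
,dps.job_name
,s.sid
,s.serial#
,s.status
FROM dba_datapump_sessions dps
,v$session s
WHERE s.saddr = dps.saddr;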
Killing or stopping a running datapump job
The difference between KILL_JOB and STOP_JOB is simple: a killed job cannot be resumed or started again, and its master table and dump files are removed as well (the log file stays behind). A stopped job keeps everything and can be restarted later.
While exporting (or importing), press Ctrl-C to get the interactive Data Pump prompt and type KILL_JOB or STOP_JOB[=IMMEDIATE]. You will be asked to confirm that you are sure…
Adding ‘=IMMEDIATE‘ to STOP_JOB does not wait for the currently running task to finish; that task has to be redone when the job is started again.
Processing object type DATABASE_EXPORT/SCHEMA/TABLE/TABLE_DATA
[Ctrl-c]
Export> KILL_JOB
..or..
Export> STOP_JOB=IMMEDIATE
Are you sure you wish to stop this job ([yes]/no): yes
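The same can be done from SQL*Plus with the DBMS_DATAPUMP API, which is handy when no Data Pump client is at hand. This is only a sketch; the job name EXP_FULL and owner SYSTEM are just the ones used in the attach example below:
DECLARE
  h NUMBER;
BEGIN
  -- attach to the job by name and owner
  h := DBMS_DATAPUMP.ATTACH(job_name => 'EXP_FULL', job_owner => 'SYSTEM');
  -- immediate => 1 behaves like STOP_JOB=IMMEDIATE;
  -- keep_master => 0 also drops the master table, i.e. the equivalent of KILL_JOB
  DBMS_DATAPUMP.STOP_JOB(handle => h, immediate => 1, keep_master => 0);
END;
/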
Resuming a stopped job
Now we can attach to the stopped job by passing its name to the ATTACH parameter of the expdp or impdp command; the client then prints the current job status and parameters:
> expdp system ATTACH=EXP_FULL
(Re)start the job with START_JOB; use ‘=SKIP_CURRENT‘ if you want to skip the task that was in progress when the job was stopped. To show progress again, type CONTINUE_CLIENT (the job will be restarted if it is idle).
Export> START_JOB[=SKIP_CURRENT]
Export> CONTINUE_CLIENT
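A stopped (not killed) job keeps its master table, so it stays visible in dba_datapump_jobs with state NOT RUNNING until it is restarted or killed. If you have forgotten the job name to attach to, a quick check might look like this:
SELECT owner_name, job_name, state
FROM dba_datapump_jobs
WHERE state = 'NOT RUNNING';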