Daily / Hourly archive log generation scripts - very handy.
Daily Archive Log Generation:
select trunc(COMPLETION_TIME,'DD') Day, thread#,
round(sum(BLOCKS*BLOCK_SIZE)/1024/1024/1024) GB,
count(*) Archives_Generated from v$archived_log
group by trunc(COMPLETION_TIME,'DD'),thread# order by 1;
Hourly Archive Logs:
set pages 1000
alter session set nls_date_format = 'DD-MON-YYYY HH24:MI:SS';
select trunc(COMPLETION_TIME,'HH') Hour,thread# ,
round(sum(BLOCKS*BLOCK_SIZE)/1024/1024/1024) GB,
count(*) Archives from v$archived_log
group by trunc(COMPLETION_TIME,'HH'),thread# order by 1;
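Either query can be restricted to a recent window with a WHERE clause; for example, the last seven days (the seven-day cutoff is arbitrary, adjust to taste):

```sql
select trunc(COMPLETION_TIME,'HH') Hour, thread#,
round(sum(BLOCKS*BLOCK_SIZE)/1024/1024/1024) GB,
count(*) Archives from v$archived_log
where COMPLETION_TIME > sysdate - 7
group by trunc(COMPLETION_TIME,'HH'),thread# order by 1;
```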
** If you are running this in a Data Guard environment you need to add applied = 'NO' to the WHERE clause, otherwise you get the log generation for both the primary and the standby and the figure won't match what the OS reports.
select trunc(COMPLETION_TIME,'DD') Day, thread#,
round(sum(BLOCKS*BLOCK_SIZE)/1024/1024/1024) GB,
count(*) Archives_Generated from v$archived_log
where applied = 'NO'
group by trunc(COMPLETION_TIME,'DD'),thread# order by 1;
Find the amount of archive space you are using (run from the archive directory):
# cat arc_by_day.sh - run from the directory containing the archives; if they are already split into dated directories, just run du -skh
#!/bin/bash
# arc_by_day.sh - sum archive log sizes per day (run from the archive directory)
# build the list of distinct month/day stamps from the ls output
array=($(ls -la | grep rman_arc | awk '{print $6 $7}' | uniq))
for ((i=0; i<${#array[@]}; i++)); do
    mth=$(echo "${array[$i]}" | cut -c1-3)
    day=$(echo "${array[$i]}" | cut -c4-5)
    chk="$mth $day"
    echo -n "$chk "
    # sum the file sizes (bytes -> GB) for this day
    ls -la | grep "$chk" | grep rman_arc | awk '{sum+=$5} END {print sum/1024/1024/1024}'
done
# sum the total
echo -n "Total "
ls -la | grep rman_arc | awk '{sum+=$5} END {print sum/1024/1024/1024}'
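Parsing ls output is fragile (the date columns shift format for files older than six months). A sketch of a more robust variant using GNU find's -printf, assuming GNU find is available and the same rman_arc file-name pattern as above:

```shell
# Sum rman_arc* file sizes per modification day, then a grand total.
# Assumes GNU find (-printf); prints ISO dates instead of "Mon DD".
find . -maxdepth 1 -name 'rman_arc*' -printf '%TY-%Tm-%Td %s\n' |
awk '{ sum[$1] += $2; total += $2 }
     END {
       for (d in sum)
         printf "%s %.4f\n", d, sum[d]/1024/1024/1024 | "sort"
       close("sort")
       printf "Total %.4f\n", total/1024/1024/1024
     }'
```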
[root@db03 arc]# ./arc_by_day.sh
Jun 14 34.2387
Jun 15 38.9001
Jun 16 109.296
Jun 17 49.5452
Jun 18 46.5221
Jun 19 32.9514
Jun 20 55.9935
Jun 21 13.1455
Total 380.592
[root@db03 arc]# pwd
/u02/backups/rman/db03/arc
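When the archive area is already split into dated directories instead, the per-day sizes come straight from du, as mentioned above. A sketch, with the path and one-subdirectory-per-day layout assumed:

```shell
# One subdirectory per day is assumed; -k reports KB (swap in -h for human-readable).
cd /u02/backups/rman/db03/arc
du -sk -- */ | sort -k2
```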
A similar count to the queries above, for when the archives live in ASM:
select substr(FIRST_TIME, 1,9), count(FIRST_TIME) from V$ARCHIVED_LOG group by substr(FIRST_TIME, 1,9) order by 1;
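One caveat: substr() on FIRST_TIME relies on an implicit DATE-to-string conversion, so the grouping depends on the session's NLS_DATE_FORMAT. A variant of the same per-day count that avoids that dependency, using trunc() as in the queries above:

```sql
select trunc(FIRST_TIME) Day, count(*) Archives
from v$archived_log
group by trunc(FIRST_TIME) order by 1;
```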