Need an ONLINE MAINFRAME Course / Project Case Study / Corporate Training / Application Support?

Let's schedule the class. Write to me at mfdtech@gmail.com; for more information refer to: http://onlinemainframe.blogspot.com/
I am willing to take up freelancing projects on COBOL / mainframe. Please mail me at mfdtech@gmail.com


Monday, July 14, 2008


Future for Mainframes in SOA

While it is far too early to get an answer, it is now possible to ask, as a serious rather than trivial question, whether new sales of zSeries mainframes can grow into the SOA and web-services market as a result of new integration tools for these traditionally back-office machines, such as DataDirect Technologies' recent major upgrade of its Shadow integration platform. Will it be enough to act as a positive lever on green-field zSeries sales?

It might still take a brave person to answer 'yes' right now, but the opportunity exists for the mainframe to start playing a full part in a web-services/SOA environment, running applications that proactively participate in that environment rather than merely contributing to it passively. While it may not be natural to think of a mainframe as a core system in such an environment, Version 7 of Shadow does offer the potential to move IBM's zSeries systems well beyond the traditional role of back-end, back-office batch system.

Given that Java is a strategic part of the Shadow V7 enhancements, the machine's inherent I/O capabilities open up some interesting new possibilities for managing high-density, real-time business applications such as online transactions in, say, retail or financial environments. In fact, a wide range of Java applications can now exploit the high-capacity I/O capabilities of the mainframe. This could mean the investment made in existing mainframe systems keeps paying off well into the future, and it makes the notion of new mainframe investments for web-services and SOA-based applications and infrastructures quite apposite.

The addition of integration tools for widely used web service-related standards such as SOAP and XML will now make it possible for mainframe business logic and web services to communicate directly, allowing the mainframe to start to play a front-line role in providing important business process services to underpin web-service delivery in real time.

There are two useful tricks performed by DataDirect with Shadow. One is its ability to integrate data directly with the common schemas of web services, such as SOAP and XML, as well as the widely used SQL interface. This takes the mainframe to a new and much richer level of direct integration with the web-services and SOA world. The second is Shadow's exploitation of the IBM zIIP and zAAP processors. Their arrival has already had an impact, allowing customers to run new and expanded workloads, and in particular DB2 database workloads, on the mainframe in a cost-effective manner. The specific advantage here is access to a high-speed processor engine that runs with none of the normal performance/pricing issues associated with the system in use. These engines are available to users for a one-off price rather than being geared to the mainframe processing capacity (MSU) pricing associated with the General Purpose Processor, yet are allowed to run at maximum speed.

Access to these processing engines is possible using either Task Control Block (TCB) or Service Request Block (SRB) threads. By developing patented technology that allows applications using TCB or SRB threads to run on the zIIP and zAAP processors, DataDirect has not only provided a significant performance improvement for existing applications but also opened zSeries machines up to some significant new operational possibilities. For example, expansion of the data integration tools to cover ODBC, JDBC and ADO means that new mainframe data queries to IMS, VSAM, IDMS and Adabas are now possible. The product features DataDirect's line of client drivers, which support ODBC 3.5.2 for 32-bit and 64-bit computing, JDBC 3.0 Type 4 drivers for 32-bit and 64-bit databases, and ADO.NET data providers. The SOAP and XML integration tools allow parsing between mainframe business logic and web-services screen logic.

Many of these drivers are already being widely used by ISVs, which provides the added advantage of opening up the mainframe as a target platform for their applications offerings, widening both their market potential and the range of applications options available to existing zSeries users.

The key here is that, by allowing the IBM Java Virtual Machine (JVM) to run within the Shadow address space, Java applications can run on the zIIP and zAAP engines at maximum system performance and with no extra running costs. In this way one of the problem areas of running Java applications on a zSeries machine (the operational expense) can be overcome, making the zSeries a viable option for Java applications. Given their natural propensity towards high I/O demand, this should make zSeries machines an interesting new platform option for many typical Java-based business applications.



For more info refer to URL : http://www.it-director.com/enterprise/content.php?cid=10009


Wednesday, June 4, 2008

Impact Analysis


Impact Analysis :

I hope the information below gives you a brief overview of the various steps involved in impact analysis (for example, when you are asked to add a new field).

Impact Analysis: This is the process of identifying the various copybooks, programs, JCLs, DB2 tables, etc. that have to be modified (i.e., are impacted) by the new changes in order to meet the requirement. Based on the requirement, the programmer will know which new fields are to be added, or which existing fields are to be modified, in

1. Copy Books

2. Files/Jcl

3. Database

4. In online Screen etc.


1. Impact Analysis on Copybooks (used by COBOL programs):

  • Identify the copybooks to be modified as part of the requirement and note them in List1.

  • Now search for the copybooks impacted because of the copybooks in List1 and note them in a new list, List2.

  • Now search for the copybooks impacted because of the copybooks in List2 and note them in a new list, List3.

  • Continue the above process until the search returns zero impacted copybooks.

  • Merge all the above lists and eliminate the duplicate copybooks.

  • Now find the various programs that use the copybooks in the merged list and record them in an Excel sheet.

2. Impact Analysis on Files/JCL:

  • Find the various JCLs/procs impacted due to the change in file length.

  • Find the various JCLs/procs whose sort cards / data libraries are impacted due to the above change.

3. Impact Analysis on DB2 tables:

  • Find all the tables impacted due to the addition of the new table fields.

  • Find the load cards impacted due to these changes.

4. Impact Analysis on CICS screens:

  • Identify the various screens/screen programs impacted due to the addition of new fields.

Tip: Use the ISPF Search-For utility (option 3.14) to search for a field in a specific library; a sample batch search job is sketched below.
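
For large libraries, the same search can be run in batch with the SuperC search program. Below is a minimal sketch; the dataset name PROD.COPYLIB and the search string 'CUST-NO' are placeholders for your own library and field name.

//SEARCH   EXEC PGM=ISRSUPC,PARM=(SRCHCMP,'ANYC')
//* NEWDD points at the library to scan; OUTDD receives the hit report
//NEWDD    DD DSN=PROD.COPYLIB,DISP=SHR
//OUTDD    DD SYSOUT=*
//SYSIN    DD *
  SRCHFOR 'CUST-NO'
/*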


Walk-in for Freshers

Walk-in for Freshers -- TCS HYD

Students who have just completed their graduation/post-graduation can apply for trainee (technical writer) positions. Eligible candidates can walk in for a written test and interview at TCS, Deccan Park, on Saturday, 07 June 2008, at 10:00 am.

-----------------------------------------------------------------
Req for DELL
Hi all, please forward resumes of 2007 & 2008 pass-outs to the emails mentioned below. Dell is planning to recruit more than 1000 new graduates for their new webmail development, in various locations. This is a great opportunity for fresh graduates to work with Dell.

Qualification:

  • BE / B.Tech - 2008 pass-outs (Computer Science, Information Technology, Electronics, Electronics & Telecommunication, Electronics & Instrumentation, etc.)

  • MCA / M.Sc. - 2007/08: only Computer Science & Information Technology branch

  • B.Sc. / BCA - 2007/08: Computer Science, Information Technology, Physics, Mathematics, Chemistry, Statistics, Electronics

Eligibility criteria:

  • Graduation: 60% and above

  • 10th/12th: 60% and above

  • No gap in education

  • Willingness to work in shifts, on any platform and at any location

  • No supplementary (un-cleared) subjects as on date

  • Good communication skills

If you have resumes meeting the above criteria you can send them to:
Bangalore: offcampus.bangalore@dellmail.com
Chennai: offcampus.chennai@dellmail.com
Hyderabad: offcampus.hyderabad@dellmail.com

Saturday, March 8, 2008

IBM DB2 UDB 701 Questions

Exam Name: IBM DB2 UDB V8.1 Database Administration

Exam Code: 000-701


Question: 1

Given the following command:

CREATE DATABASE payroll ALIAS paynew ON /path/drive
   USING CODESET codeset TERRITORY territory
   COLLATE USING IDENTITY

How is character data compared within the database?

A. Byte for byte

B. Based on the codeset

C. Based on the codepage

D. Based on the territory

Answer: A


Question: 2

Given that the following statement:

DROP TABLE payroll.employee

returns the following message:

SQL0551N "USER1" does not have the privilege to perform operation
"DROP" on object "PAYROLL.EMPLOYEE". SQLSTATE=42501

Which of the following will correct the situation?

A. GRANT DROP AUTHORITY TO user1

B. GRANT DROPIN ON SCHEMA user1 TO user1

C. GRANT DROPIN ON SCHEMA payroll TO user1

D. GRANT DROPIN ON SCHEMA employee TO user1

Answer: C


Question: 3

Which of the following is required to support infinite active log space?

A. USEREXIT = ON, LOGSECOND = 0

B. USEREXIT = ON, LOGSECOND = -1

C. LOGRETAIN = RECOVERY, LOGSECOND = 0

D. LOGRETAIN = RECOVERY, LOGSECOND = -1

Answer: B


Question: 4

Which of the following commands can be issued in a Command Line Processor (CLP) session to capture explain information for subsequent SQL statements that can be formatted using db2exfmt?

A. SET CURRENT SNAPSHOT YES

B. SET CURRENT QUERY EXPLAIN

C. SET CURRENT EXPLAIN MODE EXPLAIN

D. SET CURRENT EXPLAIN SNAPSHOT EXPLAIN


Answer: C


Question: 5

The status of jobs scheduled by the Task Center can be monitored using which of the following?

A. Journal

B. Health Center

C. Snapshot Monitor

D. Information Catalog Center

Answer: A


Question: 6

Which of the following authorities can be used to roll forward through the database logs, but NOT to restore a backup image into a new database?

A. DBADM

B. SYSADM

C. SYSCTRL

D. SYSMAINT

Answer: A


Question: 7

Given the following information:

1. A full database backup of DB1 was taken at 1:00 PM

2. A table space TS1 was backed up at 1:15PM

3. A table in TS1 was dropped at 2:00 PM

Which two of the following commands are valid after restoring table space TS1 from the full

database backup taken at 1:00PM?

A. ROLLFORWARD DATABASE db1 TO 1:15PM

B. ROLLFORWARD DATABASE db1 TO 2:30PM

C. ROLLFORWARD TABLESPACE ts1 TO 2:30PM

D. ROLLFORWARD DATABASE db1 TO END OF LOGS

E. ROLLFORWARD TABLESPACE ts1 TO END OF LOGS

Answer: B, D


Question: 8

A table is experiencing frequent inserts. Which of the following options can defer how frequently

DB2 must allocate additional index pages?

A. CLUSTER

B. PCTFREE

C. MINPCTUSED

D. SPECIFICATION ONLY

Answer: B


Question: 9

Which of the following requires each row in a table to be unique?

A. A foreign key

B. A bidirectional index

C. An index with include columns

D. An index created using PCTFREE

Answer: C


Question: 10

Exhibit:

Which of the statements and in what order must they be called to prevent normal users from

seeing SALARIES, but allow them to see NAMES?

A. 1

B. 2

C. 1,4,3

D. 1,5,3

Answer: C


Question: 11

Which of the following is automatically granted to PUBLIC when a database is created?

A. LOAD

B. PASSTHRU

C. CREATETAB

D. CREATE_NOT_FENCED

Answer: C


Question: 12

How does DB2 enforce table check constraints for data added to a table with the LOAD utility?

A. With the BUILD phase of the LOAD

B. With the SET INTEGRITY statement

C. With the DELETE phase of the LOAD

D. With the UPDATE CONSTRAINTS statement

Answer: A


Question: 13

Given the following DDL statements:

CREATE VIEW v1 as SELECT col1 FROM tab1 WHERE col1>10

CREATE VIEW v2 as SELECT col1 FROM v1 WITH CHECK OPTION

CREATE VIEW v3 as SELECT col1 FROM v2 WHERE col1<100

Which two of the following statements will fail?

A. INSERT INTO v1 VALUES (5)

B. INSERT INTO v2 VALUES (5)

C. INSERT INTO v3 VALUES (5)

D. INSERT INTO v3 VALUES (100)

E. INSERT INTO v3 VALUES (200)

Answer: A, B


Question: 14

Which of the following DB2 utilities can create a table and place data into the table?

A. LOAD

B. INSERT

C. IMPORT

D. UPLOAD

Answer: C


Question: 15

Which of the following is required to use the IMPORT utility to import data into a table?

A. SYSCTRL authority

B. LOAD authority on the table

C. ALTER privilege on the table

D. IMPORT authority on the table

E. INSERT privilege on the table

Answer: E


Question: 16

Which two of the following commands would give the minimum point in time of recovery for table

spaces in a recoverable database?

A. GET DBM CFG

B. LIST MINIMUM POINT IN TIME

C. LIST TABLESPACES SHOW DETAIL

D. GET DB CFG FOR <database_name>

E. GET SNAPSHOT FOR TABLESPACES ON <database_name>

Answer: C, E


Question: 17

Which of the following commands should be run immediately after reorganizing a table?

A. REORG

B. RUNSTATS

C. DB2RBIND

D. FLUSH PACKAGE CACHE

Answer: B


Question: 18

Which of the following privileges is granted to PUBLIC on the system catalog views when a

database is created?

A. NONE

B. SELECT

C. UPDATE

D. CONTROL

Answer: B


Question: 19

Exhibit:

How many rows will be returned after issuing the following SQL statement?

SELECT * FROM PERSON

A. 1

B. 2

C. 3

D. 5

E. 10

Answer: C


Question: 20

There is an instance on a server that needs to be discovered. The two databases in the instance

are PAYROLL and CERTIFY. The PAYROLL database should not be seen. Which of the

following will meet this requirement?

A. Set the DAS configuration parameter DISCOVER to DISABLE.

Set the DISCOVER_DB parameter in the CERTIFY database configuration file to ENABLE.

B. Set the DAS configuration parameter DISCOVER to SEARCH.

Set the DISCOVER_DB parameter in the PAYROLL database configuration file to DISABLE.

C. Set the DAS configuration parameter DISCOVER to SEARCH.

Set the DISCOVER_INST parameter in the PAYROLL database configuration file to

DISABLE.

D. Set the DAS configuration parameter DISCOVER to DISABLE.

Set the DISCOVER_INST parameter in the CERTIFY database configuration file to ENABLE.

Answer: B





Also look at http://exam.testinside.com/000-701.htm for a few more demo questions.




Monday, February 25, 2008

NULL VALUES and NULL INDICATORS in DB2

In DB2, columns defined as nullable need to be handled carefully, otherwise the program will get a null-related error; to overcome this, the column is handled using a null indicator.

  • NULL is stored using a special one-byte null indicator that is "attached" to every nullable column.

  • If the column is set to NULL, then the indicator field is used to record this.

  • Using NULL will never save space in a DB2 database design; in fact, it always adds an extra byte for every column that can be NULL. The byte is used whether or not the column is actually set to NULL. The indicator variable is transparent to the end user.

Consider the table below:

CREATE TABLE SAMP_TAB
   (SN     CHAR(10),
    SNAME  CHAR(10),
    STATUS CHAR(2)  NOT NULL WITH DEFAULT,
    CITY   CHAR(10) NOT NULL)

Note: Unless you specify NOT NULL, the default is to allow NULLs.

In the above table, the SN and SNAME columns can hold null values by default. In order to handle these nullable columns, we need NULL indicators declared in the program as S9(4) COMP variables. (An indicator variable is shared by both the database manager and the host application; therefore it must be declared in the application as a host variable corresponding to the SQL data type SMALLINT.)

Let us declare the null indicators for the above two columns in the application program as:

01 SNAME-INDNULL PIC S9(4) COMP.
01 SN-IN         PIC S9(4) COMP.



Values a null indicator can hold:

  1. -1 : the field has a NULL value

  2.  0 : the field has a non-NULL value

  3. -2 : the field value was truncated

How and why to handle null values:

  • When processing INSERT or UPDATE statements, the database manager checks the null-indicator variable, if one exists. If the indicator variable is negative, the database manager sets the target column value to null if nulls are allowed; otherwise it throws SQL error code -305. We need null indicators to handle this situation.

  • If the null-indicator variable is zero or positive, the database manager uses the value of the associated host variable.



There are two common reasons for getting -305, and their resolutions:


1) The table column is defined as NOT NULL (with no default) and we try to insert a null value.

Resolution: Make sure that the inserted value is not null. A null indicator cannot be used here since the column is defined as NOT NULL. (Validate the data; if it is not numeric, or is less than spaces, move spaces into it and then insert into or update the table.)

2) The table column is defined as nullable, the host variable has a non-null value, but the null indicator is not set in the host program, so the null indicator defaults to a negative value.

Resolution: Use a null indicator in the host program and move the relevant value to it. In order to insert a null value into the respective column, move -1 to the null indicator.

Eg:

MOVE -1 TO SNAME-INDNULL

EXEC SQL
    INSERT INTO SAMP_TAB
           (SN, SNAME, STATUS, CITY)
    VALUES (:SN, :SNAME:SNAME-INDNULL, :STATUS, :CITY)
END-EXEC

Eg:

EXEC SQL
    SELECT SNAME
      INTO :SNAME:SNAME-INDNULL
      FROM SAMP_TAB
     WHERE SN = :SN-IN
END-EXEC


(Note: If SNAME has a value, SNAME-INDNULL contains 0; if SNAME is NULL, SNAME-INDNULL contains -1.)

IF SQLCODE = -305 AND SNAME-INDNULL = -1
    DISPLAY 'SNAME is having null values'
ELSE
    :
    :
END-IF


Important Points wrt NULL Variables

  • NULLs can present problems because they are handled differently by different computers and the collating sequence is inconsistent with regard to NULLs.

  • Unless you specify NOT NULL, the default is to allow for NULLs

  • It's easy for us to get lazy and allow columns to contain NULLs when it would be better to specify NOT NULL

  • Remember to allow for NULLs creating UNKNOWN logical values. Always test your code with NULLs in all possible places.

  • The NULL is a global creature, not belonging to any particular data type, but able to replace any of their values.

  • A NULL isn't a zero, it isn't a blank string, it isn't a string of length zero.

  • The basic rule for math with NULLs is that they propagate. An arithmetic operation with a NULL will return a NULL. If you have a NULL in an expression, the result will be NULL.

  • If you concatenate a zero length string to another string, that string stays the same. If you concatenate a NULL string to a string, the string becomes a NULL.

  • In comparisons, the results can be TRUE, FALSE, or UNKNOWN. A NULL in a row will give an UNKNOWN result in the comparison.

  • Sometimes negating the wording of the problem helps. Instead of saying "Give me the cars that met all the test criteria," say "Don't give me any car that failed one of the test criteria." It is often easier to find what you do not want than what you do want. This is very true when you use the NOT EXISTS, but beware of NULLs and empty tables when you try this.

  • You can't completely avoid NULLs in SQL. However, it is a good idea to try as hard as you can to avoid them whenever possible.

  • Make yourself think about whether you really need NULLs to exist in a column before you omit the NOT NULL clause on the column definition.

  • Use NULLs sparingly
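
Because a comparison such as WHERE SNAME = NULL only ever yields UNKNOWN, nullable columns have to be tested with the IS NULL / IS NOT NULL predicates. As a small sketch against the SAMP_TAB table defined above (same column names as in that example), the first query counts the rows where SNAME is null, and the second uses COALESCE to substitute a default so the program does not need an indicator for that column:

SELECT COUNT(*)
  FROM SAMP_TAB
 WHERE SNAME IS NULL

SELECT SN, COALESCE(SNAME, 'N/A')
  FROM SAMP_TAB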




Sunday, February 17, 2008

JCL - MAXI -VALUES



Question: HOW MUCH SPACE OS ALLOCATES WHEN YOU CREATE A PS OR PDS?
Answer: 56 KB

Question: MIN NO OF DATASET NAMES (PDS) IN ONE DIRECTORY BLOCK?
Answer: FOUR - SIX

Question: THE MAXIMUM NUMBER OF STEPS IN A JOB?
Answer: 255

Question: MAX. NO OF DD STATEMENTS IN A JOB
Answer: 3273

Question: What is the maximum number of instream procedures?
Answer: 15

Question: For how long can a job execute continuously on a mainframe?
Answer: 248 DAYS

Question: How much space is involved when we code BLOCKSIZE, TRK & CYL?
Answer: One block constitutes 32 KB of formatted memory / 42 KB of unformatted memory; 6 blocks make one track and 15 tracks make one cylinder.

Question: What is DSNDB06 ?
Answer: This is the Place where DB2 Catalog resides

Question: What is the use of DSNDB07 ?
Answer: This is the area where sorting takes place in DB2

Question: WHAT IS DATACOM DB?
Answer: IT IS A DATABASE USED WITH VSE.

Question: What are the maximum and minimum sizes of any CONTROL AREA (VSAM datasets)?
Answer: Minimum size: 1 track. Maximum size: 1 cylinder.

Question: WHAT IS COMM?
Answer: COMM - HALF WORD BINARY

Question: What is a procedure?
Answer: A set of precoded JCL that can be modified through the use of parameters or override cards. Note: Procedures can be catalogued or instream.

Question: HOW MANY TIMES IS SECONDARY SPACE ALLOCATED?
Answer: 122 TIMES

Saturday, February 16, 2008

DB2 Basics - Quick Reference


  • Relational Database Management System(RDBMS)

  • Universal Database(UDB)


DB2 runs as one of the subsystems on the mainframe.

Any number of DB2 subsystems can be created on a mainframe, as per requirements.


Hierarchy of DB2 Sub system:




(hierarchy diagram not reproduced)

TS – Table Space, INS – Index Space, T – Table, IND – Index, V – View


Maximum storage space for a Table Space is 64 million bytes.

SQL can be used to create, delete, update and query the objects.


SQL queries can be executed using the following techniques:

  1. Application programming

  2. Tools like QMF (Query Management Facility) and SPUFI (SQL Processing Using File Input)


DATA TYPES :


Integer       – 4 bytes

Smallint      – 2 bytes

Char(n)       – n bytes

Varchar(n)    – n+2 bytes

Graphic(n)    – 2n bytes

Vargraphic(n) – 2n+2 bytes

Date          – 10 bytes

Time          – 8 bytes

Timestamp     – 26 bytes

NORMALIZATION :


Arranging the data in the database in an organized manner.

1NF: Avoiding multiple values or set of values in one column.

2NF: Avoiding repeated rows by defining primary key.

3NF: Separating functionally dependent and non-functionally dependent columns

Primary key :-


  • Uniquely identifies a row

  • Can be formed from a single column or multiple columns

  • Does not allow duplicate records

  • Cannot contain nulls


Foreign key : -


  • Another identifier which can be used to build a relationship between tables

  • Must refer to the primary key of the parent table, with the same data type and length

  • Can consist of a single column or multiple columns

  • Can contain nulls or duplicate values

  • Multiple foreign keys can be defined in one table

  • A foreign key should be defined at the time of defining the child table, in the CREATE statement, with the REFERENCES option.


CREATE TABLE ITEM
   (INO    INTEGER NOT NULL,
    INAME  CHAR(15),
    CNO    INTEGER,
    PRIMARY KEY (INO),
    FOREIGN KEY (CNO) REFERENCES CUST)




REFERENTIAL INTEGRITY:


The relationship between two tables, achieved by defining a foreign key.


PRIMARY KEY                                         FOREIGN KEY
Cannot contain null values or duplicate rows        Can contain null values or duplicate rows
Cannot be updated                                   Can be updated
Can be defined as a foreign key in another table    Must be the primary key of another table
Only one primary key can be defined per table       Multiple foreign keys can be defined per table


SQL(Structured Query Language)


  • DDL (Data Definition Language)

Create, alter, drop


  • DML (Data Manipulation Language)

Insert, update, select & delete


  • DCL (Data Control Language)

Grant, Revoke


  • TCL (Transaction Control Language)

Commit, Rollback



Some SQL Queries


Insert with literal values: Insert into cust(cno, cname, cloc) values (10, 'xyz', 'hyd')

Insert with host variables: Insert into cust(cno, cname, cloc) values (:v1, :v2, :v3)

v1, v2, v3 are host variables to be defined in the WORKING-STORAGE SECTION.


Delete from cust

Delete from cust where cno = 20


Update cust set cname = 'xyz' where cno = 20


Select cno, cname from cust

Select * from cust

Select * from cust where cno = :v1

Select * from cust where cno = :v1 and cname = :v2

Select * from cust where cno between 20 and 60

Select * from cust where cname like '%y%'


Column functions:

Select max(sal) from emp

Select min(sal) from emp

Select avg(sal) from emp

Select sum(sal) from emp



The above statements return NULL if no row exists for the specified condition.

To avoid duplicate rows: select distinct cno, cname from cust

To get the total no. of rows: select count(*) from cust

The above statement returns zero if no row exists for the specified condition.



SUBQUERY:


  • A query within a query

  • The inner query executes first, and the outer query executes based on the result of the inner query

  • A maximum of 15 subqueries can be coded

  • To simplify subqueries, the logic can be built with a combination of COBOL and SQL statements


To retrieve second maximum salary from emp table:

Select max(sal) from emp where sal <(select max(sal) from emp)


To retrieve third maximum salary from emp table:


Select max(sal) from emp

where sal < (select max(sal) from emp

Where sal < (select max(sal) from emp))


CORRELATED SUBQUERY:


  • For every row of the outer query, the inner query executes at least once

  • The outer query executes first, and then the inner query executes

  • Practical example: to find the top 2, 3 or n salaries


Select a. sal from emp a

where 0 = (select count(*) from emp b

Where a.sal < b.sal)



Replace the 0 in the above query with 1, 2, 3, ... to get the 2nd, 3rd, 4th, ... maximum salary; see the worked example below.
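
For instance, a sketch of the same correlated subquery rewritten for the 3rd maximum salary (this assumes salaries are distinct; with duplicates a count of distinct salaries would be needed):

Select a.sal from emp a
 where 2 = (select count(*) from emp b
             where b.sal > a.sal)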


DCLGEN :


Declaration Generator: a tool that generates the equivalent COBOL variables.

It can be used to generate host variables with data types equivalent to the DB2 column definitions.


DB2 table --> DCLGEN --> COBOL host variables

Host variables:


  • Can be used to pass data from the COBOL program to the DB2 table, or from the DB2 table to the COBOL program.

  • When host variables are coded within SQL statements they must be prefixed with a colon, e.g. :hv-cname.

  • The table name must be supplied as input to DCLGEN, and a partitioned dataset member should be given as the output.

  • After DCLGEN creates the variables, they must be copied into the application program's WORKING-STORAGE SECTION using the INCLUDE command (a sketch of typical DCLGEN output follows this list), i.e.

EXEC SQL
    INCLUDE CUSTDCL
END-EXEC.

  • INCLUDE and COPY have the same functionality


SQLCODE :


  • A predefined numeric field that can be used to check whether an SQL statement executed successfully or unsuccessfully.

  • SQLCODE is stored in the SQLCA (SQL Communication Area)

  • Copy the SQLCA into the WORKING-STORAGE SECTION

  • It is a system-defined variable

  • An EVALUATE or IF statement must be coded to check the SQLCODE immediately after each SQL statement.

  • SQLCODE = 0    ---- successful

            = +100 ---- end of table, or record not found.


Sample program:


WORKING-STORAGE SECTION.


EXEC SQL

INCLUDE SQLCA

END-EXEC


EXEC SQL

INCLUDE CUSTDCL

END-EXEC.

01 WS-SQL-CODE PIC S9(4) COMP.
   88 88-SUCCESS                VALUE 0.
   88 88-NOT-FOUND              VALUE 100.
   88 88-FOREIGN-KEY-VIOLATION  VALUE -532.
   88 88-MULTIPLE-ROWS          VALUE -811.


PROCEDURE DIVISION.

    EXEC SQL
        UPDATE CUST
           SET CNAME = :HV-CNAME
         WHERE CNO = :HV-CNO
    END-EXEC.

    MOVE SQLCODE TO WS-SQL-CODE.

    EVALUATE TRUE
        WHEN 88-SUCCESS
            DISPLAY 'SUCCESSFULLY UPDATED'
        WHEN 88-NOT-FOUND
            DISPLAY 'RECORD NOT FOUND'
        WHEN 88-FOREIGN-KEY-VIOLATION
            DISPLAY 'FOREIGN KEY VIOLATION'
        WHEN OTHER
            DISPLAY 'ERROR OCCURRED IN UPDATE'
    END-EVALUATE.

    STOP RUN.



CURSOR:


To retrieve multiple rows for a given condition.


Let us take the following example:


Exec sql

Select cno,cname,cloc

into :hv-cno,:hv-cname,:hv-cloc

from cust where cloc =:hv-cloc

end-exec.

If the condition is satisfied by one row, it executes successfully. If the condition is satisfied by multiple rows, it won't work and returns -811 as the SQLCODE. For this we use cursors.


  • Cursors can be used to retrieve multiple rows for a given condition

  • The cursor cycle is Declare ---> Open ----> Fetch -----> Close

  • Declare: declares or defines a name for the cursor against a table.

    Can be coded in the WORKING-STORAGE SECTION or the PROCEDURE DIVISION;
    for better readability, code it in the WORKING-STORAGE SECTION.

  • Open: opens the cursor and builds the result set of rows, in a buffer, for the given condition.

    Retrieves the data into the buffer.
    Must be coded in the PROCEDURE DIVISION only.
    The WHERE-condition value must be supplied before opening the cursor.

  • Fetch: retrieves rows one by one from the buffer into the application program.

    Must be coded in the PROCEDURE DIVISION after the OPEN.
    Must be coded with host variables.
    The number of host variables in the FETCH and the number of columns in the DECLARE must be the same.
    Can be executed multiple times using PERFORM, i.e. until end of table or record not found, which is identified by SQLCODE = 100.

  • Close: closes the cursor.

    Must be coded in the PROCEDURE DIVISION only.
    Must be executed after the OPEN statement.


Practical examples: cursors can be used to retrieve data based on location, date, or product.


EXEC SQL
    DECLARE C1 CURSOR FOR
        SELECT CNO, CNAME FROM CUST
         WHERE CNAME = :HV-CNAME
END-EXEC.

EXEC SQL
    OPEN C1
END-EXEC.

PERFORM UNTIL SQLCODE = 100
    EXEC SQL
        FETCH C1 INTO :HV-CNO, :HV-CNAME
    END-EXEC
END-PERFORM.

EXEC SQL
    CLOSE C1
END-EXEC.


FOR UPDATE OF / WHERE CURRENT OF:


  • Can be used to update rows one by one when multiple rows satisfy the condition.

  • Before the update, the cursor has to be declared with the FOR UPDATE OF column option.

  • The WHERE CURRENT OF cursor-name option must be used with the UPDATE command.

  • WITH HOLD: this option can be used to keep the cursor open even after a COMMIT statement.

  • It must be coded on the cursor declaration.

EXEC SQL
    DECLARE C2 CURSOR WITH HOLD FOR
        SELECT CNO, CNAME FROM CUST
         WHERE CNAME = :HV-CNAME
           FOR UPDATE OF CNAME
END-EXEC.

EXEC SQL
    OPEN C2
END-EXEC.

EXEC SQL
    FETCH C2 INTO :HV-CNO, :HV-CNAME
END-EXEC.

EXEC SQL
    UPDATE CUST SET CNAME = 'ABC'
     WHERE CURRENT OF C2
END-EXEC.

EXEC SQL
    CLOSE C2
END-EXEC.



INDEX:


  • An index allows duplicate values

  • A unique index doesn't allow duplicate rows

  • The cross-reference between the index and the table is called a clustered index.

  • CREATE INDEX IN1 ON CUST (CNO)
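
By way of comparison, a unique index on the same column would be created as below (UIN1 is just an illustrative name):

CREATE UNIQUE INDEX UIN1 ON CUST (CNO)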



PRIMARY KEY

  • Uniquely identifies a row

  • No duplicate rows, no null values

  • Can consist of single or multiple columns

  • Stored in SYSKEYS

INDEX

  • Records are identified based on the index

  • Duplicate rows and null values are allowed

  • Can consist of single or multiple columns

  • Stored in SYSINDEX

UNIQUE INDEX

  • Records are identified based on the index

  • No duplicate rows

  • Can consist of single or multiple columns

  • Stored in SYSINDEX




VIEWS:


CREATE VIEW CVIEW (VCNO, VCNAME, VCLOC) AS
    SELECT CNO, CNAME, CLOC FROM CUST
     WHERE CNAME LIKE '%X%'


  • A logical representation of a table

  • Virtual: only the view definition is stored, not the data

  • Can be derived from a single table or multiple tables

  • Views are updateable if they are derived from a single table without any column functions or GROUP BY

  • Multiple views can be generated from a single table.

  • View definitions are stored in SYSVIEWS


Advantages of Views:


  • Data security

  • Data correctness

  • Logical data independence

  • Only part of the information can be made visible to the users

  • Accessing can be faster.



DELETE RULES:


  • Delete rules apply to DELETE commands against the parent table; an example definition follows this list.

  • Delete rules are of 3 types:

      1. ON DELETE CASCADE – all matching child rows are deleted automatically when we delete the parent row.

      2. ON DELETE RESTRICT – the delete of the parent row is restricted if matching child rows exist (this is the default).

      3. ON DELETE SET NULL – the foreign key of all matching child rows is set to null when we delete the parent row.
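
A minimal sketch of how a delete rule is attached to the foreign key, reusing the CUST/ITEM tables from the earlier example (the column names are the same assumptions as before):

CREATE TABLE ITEM
   (INO    INTEGER NOT NULL,
    INAME  CHAR(15),
    CNO    INTEGER,
    PRIMARY KEY (INO),
    FOREIGN KEY (CNO) REFERENCES CUST
        ON DELETE CASCADE)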


UNION:


  • UNION is used to combine rows into one result from one or more tables.

  • Rules: the number of columns and the data types of both queries must be the same; the column names may be different.

  • UNION eliminates duplicate rows

  • UNION ALL retrieves duplicate rows as well.


SELECT CNO,CNAME FROM CUST WHERE CNO=10

UNION / UNION ALL

SELECT CNO,CNAME FROM ITEM WHERE INO=20


JOINS:


  • JOINs can be used to combine columns from one table or multiple tables; see the sample query after this list.

  • JOIN types are:

      1. left outer join: retrieves matching rows, plus the non-matching rows from the left-side table

      2. right outer join: retrieves matching rows, plus the non-matching rows from the right-side table.

      3. full outer join: retrieves matching rows, plus the non-matching rows from both tables.

      4. inner join: retrieves only the matching rows; a self join (joining a table to itself) is achieved by defining an alias for the table.
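
For example, a left outer join between the CUST and ITEM tables used earlier (assuming CNO is the joining column, as in the foreign-key example) lists every customer together with their items, if any:

SELECT C.CNO, C.CNAME, I.INO, I.INAME
  FROM CUST C
  LEFT OUTER JOIN ITEM I
    ON C.CNO = I.CNO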

EXPLAIN :


It can be used to evaluate the performance of SQL queries.

It can be used to tune SQL queries.

The input is an SQL query and the output is written to the PLAN_TABLE.

For every SQL query that is explained, rows are added to the PLAN_TABLE.

The PLAN_TABLE contents can be unloaded to a physical sequential file. A sample EXPLAIN is shown below.
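
A minimal sketch of running EXPLAIN (for example from SPUFI) and then reading the access path back from the PLAN_TABLE; QUERYNO 1 is just an arbitrary tag chosen for this example:

EXPLAIN PLAN SET QUERYNO = 1 FOR
    SELECT CNO, CNAME FROM CUST WHERE CNO = 10;

SELECT QUERYNO, TNAME, ACCESSTYPE, ACCESSNAME, METHOD
  FROM PLAN_TABLE
 WHERE QUERYNO = 1;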




Plan table (illustrative layout)

Query     Table   No. of   Index   No. of    Owner   Join    Group   CPU
block no  name    cols             indexes           type    by      time
--------  ------  -------  ------  --------  ------  ------  ------  -------
1         Cust    10       In1     1         Custc   Self    Y       10 min



DB2 CATALOG:


  • Consists of table space, index space, index, unique index, views, alias, synonyms, keys.

  • When we create a table, the details of the table are entered into SYSTABLE automatically.


SysIBM.SYSTABLE


Table Name   No. of cols   Owner name   Created by   Created date   Created time
----------   -----------   ----------   ----------   ------------   ------------
Cust         10            Abc          Xyz          02-apr-2004    0850
Item         15            Mno          Rst          06-apr-2004    1020


SysIBM.SYSINDEX



Index name   Table Name   No. of cols   Owner name   Created by   Created date   Created time
----------   ----------   -----------   ----------   ----------   ------------   ------------
In1          Cust         10            Abc          Xyz          02-apr-2004    0850
In2          Item         15            Mno          Rst          06-apr-2004    1020


SysIBM.SYSCOLS


Col name   Table name   Index name   Primary key   Foreign key
--------   ----------   ----------   -----------   -----------
Cno        Cust         In1          Cno           -----
Cname      Cust         In1          Cno
Cloc       Cust         In2          Cno
Ino        Item                      Ino           Cno
Iname      Item                      Ino           Cno
Ides       Item                      Ino           Cno


SysIBM.SYSKEYS: All primary & foreign keys.


GRANT SELECT ON TABLE SYSCOLS TO PUBLIC

GRANT SELECT, DELETE, UPDATE ON TABLE SYSCOLS TO USER1, USER2

REVOKE SELECT ON TABLE SYSCOLS FROM PUBLIC


CATALOG:


SYSTABLE, SYSCOL, SYSKEYS, SYSINDEX, SYSPKS, SYSFKS, SYSALIAS, SYSSYNONYMS, SYSVIEWS, SYSTABLESPACE, SYSINDEXSPACE.


PRECOMPILATION PROCESS:


The precompiler takes the COBOL+DB2 program as input and generates a DBRM, which is stored in a user-defined PDS as a separate member with LRECL=80.


DSNHPC --- IBM supplied utility used for precompilation.


Precompiler functions:


  • Separates SQL & COBOL statements

  • Check SQL syntaxs

  • Replace all SQL statements with host language call statements in the compiled program.

  • Which generates timestamp tokens


BIND:


BIND takes the DBRM as input and generates a package and an application plan. The package is loaded into the DB2 directory; the plan is recorded in SYSPLAN.


Bind functions:


  • Checks authorization for the SQL statements

  • Checks the SQL statements for errors such as:

  1. A column missing from the select list but used in ORDER BY or GROUP BY

  2. A mismatch between the number of columns and host variables

  3. A data type mismatch between columns and host variables

  4. Indicator variables not declared

  5. Data truncation.


BIND SUBCOMPONENTS/PARAMETERS:


  1. OPTIMIZER:

    • It generates an optimized access path by analyzing the statistics of the objects referenced by the SQL statements, which are stored in the catalog.

    • The RUNSTATS utility, which collects these statistics, is one of the ISPF panel options stored in the DB2 defaults option.

    • The optimized path is stored in the package, which is not an executable module.


  2. ISOLATION LEVEL:

Used to control locking of the database while SQL statements execute.


Cursor Stability (CS): acquires the address of a row, sets the pointer to the specified row based on the SQL query, and acquires a lock against that row. The lock is released after the transaction, before the commit.


Repeatable Read (RR): acquires the address of a row and a lock against the page (1 page = 4096 bytes), and releases the lock only after the COMMIT statement.


The default is RR.


  3. RUNTIME SUPERVISOR:

Oversees the execution of SQL statements, using statistics such as the number of tables, columns, indexes and keys.


  4. PLAN/APPLICATION PLAN:

This is the executable module, the actual output of the BIND. For a batch program, the plan name must be specified in the run JCL to execute the SQL queries; for an online program, it must be specified in the RCT. The application plan is tied to the load module through the timestamp tokens.


COBOL COMPILATION:

The compiler takes the modified COBOL source as input and generates an object program, which is link-edited into the load module along with the timestamp tokens.


UTILITIES USED:


DSNHPC : system utility for the precompile step

IKJEFT01 or IKJEFT1B --- BIND / run a COBOL-DB2 program

IGYCRCTL or IKFCBL00 --- COBOL compilation

IEWL or HEWL --- link-edit
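
As a rough illustration of where the program name, plan name, load library and DB2 subsystem are specified (see interview question 16 below), a typical run step using IKJEFT01 looks like the sketch below; the names DSN1, CUSTPGM, CUSTPLAN and MY.LOADLIB are placeholders only:

//RUNDB2   EXEC PGM=IKJEFT01
//SYSTSPRT DD SYSOUT=*
//SYSTSIN  DD *
  DSN SYSTEM(DSN1)
  RUN PROGRAM(CUSTPGM) PLAN(CUSTPLAN) -
      LIB('MY.LOADLIB')
  END
/*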



INTERVIEW QUESTIONS: (watch this space for more info on the questions below)


  1. What is RI? Where did you use RI in your project? Explain with an example.

  2. What is the difference between primary key, foreign key, index & unique index?

  3. Can we create index when table has duplicate rows?

  4. Can we create unique index when table has duplicate rows?

  5. Can we create index or unique index on empty table?

  6. What happens to the package when index is dropped? What’s the process or steps to be taken?

  7. How to delete package?

  8. Where package is stored? How to retrieve package?

  9. Difference between plan & package?

  10. What are the steps to be taken when SQL statements are changed without changing any COBOL statements?

  11. Do we need to pre compile when index is dropped?

  12. Can we bind when table is dropped?

  13. Can an optimized access path consist of multiple DBRMs?

  14. What is the significance of timestamp tokens?

  15. What is the significance of normalization?

  16. Where do we specify run program, plan name, library & DB2 subsystem

  17. IBM supplied utility to run COBOL + DB2

  18. What is difference between DB2 & COBOL files & give some example for COBOL files & DB2 tables related to your project?

  19. Can we load data from sequential file to table or table to sequential file?

  20. What are the steps to be followed to develop DB2+ COBOL program?

  21. Can we prepare a program/compile when DB2 system is down?

  22. How to identify DB2 test or production system by seeing run JCL?

  23. What is the output of explain?

  24. What is the difference between correlated sub query & sub query?

  25. How to find 4th max sal?

  26. How to find nth max sal?

  27. How to evaluate SQLcodes & where it is stored?

  28. How to count total no of unique rows from the table?

  29. How to sum repeated rows?

  30. How to write a cobol program for above query? Retrieve row by row & use cobol logic to sum repeated rows?

  31. What is the significance of DCLGEN?

  32. Where DCLGEN is stored?

  33. Difference between join & union?

  34. Difference between UNION & UNION ALL?

  35. Can we have different column names in a UNION?

  36. How do you evaluate/tune/analyze SQL queries?

  37. What are the JCL utilities for compile, pre compile, bind & link edit?

  38. What is the significance of isolation levels?

  39. Can we alter foreign key?

  40. Can we alter primary key?

  41. Can we alter data type & length?

  42. What are the equivalent cobol variables for varchar?

  43. What is the time stamp & its format?