Exercise 15-4

Team,

This is my solution to Exercise 15-4.

Mohamed El-Safory

Java

import java.awt.*;
import javax.swing.*;

public class Exercise15_4 extends JFrame {
    public Exercise15_4() {
        add(new MultiplicationTablePanel());
    }

    public static void main(String[] args) {
        Exercise15_4 frame = new Exercise15_4();
        frame.setSize(300, 400);
        frame.setTitle("Exercise15_4");
        frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
        frame.setLocationRelativeTo(null); // Center the frame
        frame.setVisible(true);
    }
}

class MultiplicationTablePanel extends JPanel {
    protected void paintComponent(Graphics g) {
        super.paintComponent(g);

        int x = 10;
        int y = 40;
        String s = "";
        int i = 0;

        // Display the title
        g.setColor(Color.red);
        g.setFont(new Font("Times", Font.BOLD, 20));
        g.drawString("Multiplication Table", x + 50, y);

        g.setFont(new Font("Times", Font.BOLD, 15));

        // Draw the row labels 1-9 down the left side
        y += 30;
        for (i = 1; i < 10; i++)
            g.drawString(" " + i, x + 10, y + 10 + i * 20);

        // Draw the column labels 1-9 across the top
        x += 40;
        for (i = 1; i < 10; i++) {
            s = s + "  " + i;
        }
        g.drawString(s, x, y);

        // Draw the border around the products
        y += 10;
        g.drawRect(x, y, 200, 200);

        s = "";
        y += 20;

        // Build and draw one row of products at a time;
        // single-digit products get an extra space so the columns line up
        for (i = 1; i < 10; i++) {
            for (int j = 1; j < 10; j++) {
                if (i * j < 10)
                    s = s + "  " + i * j;
                else
                    s = s + " " + i * j;
            }

            g.drawString(s, x, y);
            s = "";
            y += 20;
        }
    }
}

ERP Business Analyst (Job opportunity)

Hi all,

There is an ERP Business Analyst job vacancy in the USA.

If you are interested, please let me know and send your updated CV to eng_safory@yahoo.com.

Job Description

Under general supervision, is responsible for creating and maintaining business processes by providing process analysis, needs assessments, infrastructure requirements and cost/benefit analysis in an effort to align information technology solutions with business initiatives. Provides references for users by writing and maintaining user documentation, providing help desk support and training users. Also responsible for providing in-depth project management for selected projects. This job works closely with all constituents to identify and maximize opportunities for using data warehouse systems to improve business processes, promote the strategic use of information technology, and enable analysts and management to use these systems for improved, timely decision making. This job provides the leadership, vision and direction for data warehouse and business knowledge systems, ensuring support of the company’s business objectives and requirements.

Collaborate with Executive, Business Unit, and ITS Management to translate corporate/functional business and information objectives into data warehouse business intelligence and into strategic/tactical business plans and systems development. Analyze JD Edwards ERP and legacy data sources and design a detailed mapping to the data mart schema. Identify all the required business dimensions, attributes, facts, measures and key performance metrics based on reporting and business needs, and facilitate rapid development of a GUI for the Enterprise Reporting Initiative (ERI). Perform data modeling (conceptual, logical and physical) based on star, snowflake and hybrid designs suitable for operational and analytical reporting needs. Design the architectural framework required to support the analytical needs using Informatica, OBIEE and DB2 UDB. Manage the overall project, which includes a team of onsite and offshore resources.

Independently design and create the repository (RPD) using the OBIEE 11G tool. Develop repository objects such as physical data source objects, connection pools, the Business Model and Mapping layer, and Web Catalog objects per the reporting requirements, using the OBIEE Admin and Web tools. Create reports and dashboards to visualize the information and add interactivity using graphs, charts, pie charts and gauges. Create user-friendly dashboards and analyses. Perform unit testing and integration testing of developed reporting objects and document the results. Validate the business model in the Oracle BI Administration Tool by performing a consistency check, validating logical source tables, logical columns, and repository-level calculations done in the Business Model and Mapping layer. Implement fragmentation, aggregate navigation and level-based dimensions in the BI repository. Develop efficient data load processes using the Informatica 9.x tool. Create ETL mappings and scripts using Informatica PowerCenter Designer tool objects such as the source analyzer, target designer, mapplet designer, transformation developer, repository manager and workflow manager. Perform unit testing and integration testing of developed code and document the results. Manage and direct onshore and offshore application development teams to develop and deliver applications, assuring proper solutions are developed and tested for user review.

Perform ETL code, SQL code and reporting code tuning to get the desired performance at all levels. The candidate should ensure the accuracy, efficiency and completeness of development done by other project team members working onsite and offshore. Remain knowledgeable of current IT and company business operations and processes. Maintain professional and technical knowledge by attending educational workshops, reviewing professional publications, establishing personal networks, benchmarking state-of-the-art practices and participating in professional societies. Provide training and ongoing application support, including end-user support. Maintain user confidence and protect operations by keeping relevant information confidential. Support the IT application group using analytical skills.

Desired Skills and Experience

Bachelor's Degree required (Business Administration/Information Technology preferred) with a business background and the equivalent of at least eight years' experience in process design and documentation in a related industry. Extensive working experience in OBIEE 11G and Informatica 9.x ETL tool sets, with at least 7 to 10 years of hands-on experience in these tools. Very good understanding and usage of various transformations such as Expression, Filter, Joiner, Lookup, Update Strategy and Router to transform, migrate and clean data. Hands-on experience with the workflow monitor. Very good understanding of DB2 SQL script writing and procedures. Computer literate; the Microsoft Office suite is a must. Excellent interpersonal skills. Ability to adapt to new software and technology. Familiar with and in command of software documentation tools. Ability to perform System Analyst duties. Provide user support and application support.

Optimize a SQL Server Analysis Services Measure Group Partition for Performance

Problem

In SQL Server Analysis Services (SSAS), each measure group has at least one partition by default. When a cube is created, this partition has no aggregation scheme defined, so it is not optimized for performance. As you know, aggregations are pre-calculated sets of data that improve query response time during query evaluation. In this tip we will learn how to design aggregations for a partition and optimize it for performance.

Solution

The Aggregation Design Wizard in SSAS is the easiest way to design aggregations for a partition. For the purpose of this tip, we will use the project developed in the SSAS Tutorial. From that tutorial, we already have an SSAS project, a cube and two measure groups. Based on these objects, follow the steps below to create an aggregation design.

Using XMLA Command to Clear Cache of a SQL Server Analysis Service Database

Problem

One of the Business Intelligence developers in my company approached me yesterday with a dilemma. He wanted to know if there was a way to clear the cache of an Analysis Services database other than by recycling the Analysis Services service in SQL Server. At first I started to tell him, but figured it would be smarter to document the process and share the information. Below is a process that can be used for SQL Server 2005 and later versions.
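
As a minimal sketch of the command (the DatabaseID "Adventure Works DW" is a placeholder; substitute the ID of your own database), the ClearCache command can be run from an XMLA query window in SQL Server Management Studio:

<!-- Clear the cache of a single Analysis Services database.
     "Adventure Works DW" is a placeholder DatabaseID. -->
<ClearCache xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Object>
    <DatabaseID>Adventure Works DW</DatabaseID>
  </Object>
</ClearCache>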

Server 'SERVERNAME1' is not configured for RPC for a Linked Server

Once in a while I am asked to troubleshoot a SQL Server database where my only connection is through a linked server. Because the database server is on a protected network, I don't have port 1433 open to connect to the instance with SQL Server Management Studio.

There are a couple of commands I like to run to check the health of the database.

First, already knowing that the database is running, I like to look at the error log with the xp_readerrorlog extended stored procedure.
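
As a hedged sketch of both halves of the workaround (SERVERNAME1 is the linked server name from the error message): enabling the rpc out option clears the "not configured for RPC" error, after which commands such as xp_readerrorlog can be executed through the linked server.

-- Enable RPC Out on the linked server (run on the local instance).
EXEC sp_serveroption @server = 'SERVERNAME1',
                     @optname = 'rpc out',
                     @optvalue = 'true';

-- With RPC Out enabled, read the remote error log through the linked server.
EXEC ('EXEC master.dbo.xp_readerrorlog') AT [SERVERNAME1];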

Introduction to Table Partitioning

Divide and conquer is what Oracle was thinking with this one. Table partitioning is about optimizing “medium selectivity queries”. The Oracle database has optimization techniques for high selectivity queries, with the use of indexes. If we need to process all the data in a big table, we have to live with the fact that it will take a while, but the engine will process the data as fast as possible. However, a medium selectivity query needs just a portion of the data, for instance a tenth. This is too much data for indexes and too little for full table scans, so the processing time can become rather long relative to the outcome.

Let me give you a brief explanation of how Oracle collects its data, and how table partitioning can help for these queries. Do note this is not for everyone; “Oracle Partitioning” is an extra cost option, for Enterprise Edition only.

Access methods in a nutshell

Oracle has two commonly used table access methods: “full table access” and “access by rowid”. For the first method, Oracle reads all the blocks in a table and applies filters afterwards. This might look like a lot of overhead, but multiblock reads are used to read large amounts of data in bulk. Access by rowid is mostly used in conjunction with indexes: the index scan returns a rowid, and a single-block read fetches the block we’re interested in. This looks like an interesting method, because we read far less data. But when we need a lot of records, it means many small operations that each need to be set up, which brings considerable overhead. Depending on your data, you’re better off with multiblock reads when you’re fetching more than 5% to 10% of the table.

Screenshot: Oracle multiblock and single-block reads
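
To make the idea concrete, here is a minimal sketch (the sales table and its columns are invented for this example) of how range partitioning lets a medium selectivity query read only the partitions it needs:

-- Hypothetical sales table, range-partitioned by quarter.
CREATE TABLE sales (
  sale_id   NUMBER,
  sale_date DATE,
  amount    NUMBER(10,2)
)
PARTITION BY RANGE (sale_date) (
  PARTITION p2024_q1 VALUES LESS THAN (DATE '2024-04-01'),
  PARTITION p2024_q2 VALUES LESS THAN (DATE '2024-07-01'),
  PARTITION p_max    VALUES LESS THAN (MAXVALUE)
);

-- The optimizer prunes to p2024_q2 alone, so the multiblock-read machinery
-- scans roughly a quarter of the data instead of the whole table.
SELECT SUM(amount)
FROM   sales
WHERE  sale_date >= DATE '2024-04-01'
AND    sale_date <  DATE '2024-07-01';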

SQL SERVER – Find Row Count in Table – Find Largest Table in Database

It is very good to see excellent participation there. My original script had not taken the table schema into account; a SQL Server expert has modified it to include the schema. Here is the new, modified script.

SELECT sc.name + '.' + ta.name AS TableName,
       SUM(pa.rows) AS RowCnt
FROM sys.tables ta
INNER JOIN sys.partitions pa
    ON pa.object_id = ta.object_id
INNER JOIN sys.schemas sc
    ON ta.schema_id = sc.schema_id
WHERE ta.is_ms_shipped = 0
  AND pa.index_id IN (1, 0)
GROUP BY sc.name, ta.name
ORDER BY SUM(pa.rows) DESC
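
Since the title also asks for the largest table, the same query (using only the system views already shown) can be limited to the top row:

-- Largest user table by row count: identical joins, limited to one row.
SELECT TOP (1) sc.name + '.' + ta.name AS TableName,
       SUM(pa.rows) AS RowCnt
FROM sys.tables ta
INNER JOIN sys.partitions pa ON pa.object_id = ta.object_id
INNER JOIN sys.schemas sc ON ta.schema_id = sc.schema_id
WHERE ta.is_ms_shipped = 0
  AND pa.index_id IN (1, 0)
GROUP BY sc.name, ta.name
ORDER BY SUM(pa.rows) DESC;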

ESTIMATED TIME FOR BACKUP / RESTORE

This script can be used to find the estimated completion time of a backup or restore that is in progress on your SQL Server instance. It is applicable to SQL Server 2005 and above.

Script

SELECT r.session_id,
       r.command,
       CONVERT(NUMERIC(6,2), r.percent_complete) AS [Percent Complete],
       CONVERT(VARCHAR(20), DATEADD(ms, r.estimated_completion_time, GETDATE()), 20) AS [ETA Completion Time],
       CONVERT(NUMERIC(6,2), r.total_elapsed_time / 1000.0 / 60.0) AS [Elapsed Min],
       CONVERT(NUMERIC(6,2), r.estimated_completion_time / 1000.0 / 60.0) AS [ETA Min],
       CONVERT(NUMERIC(6,2), r.estimated_completion_time / 1000.0 / 60.0 / 60.0) AS [ETA Hours],
       CONVERT(VARCHAR(100),
              (SELECT SUBSTRING(text, r.statement_start_offset / 2,
                      CASE WHEN r.statement_end_offset = -1 THEN 1000
                           ELSE (r.statement_end_offset - r.statement_start_offset) / 2
                      END)
               FROM sys.dm_exec_sql_text(r.sql_handle))) AS [Statement]
FROM sys.dm_exec_requests r
WHERE r.command IN ('RESTORE DATABASE', 'BACKUP DATABASE')

Sample Output

session_id  command          Percent Complete  ETA Completion Time  Elapsed Min  ETA Min  ETA Hours  Statement
52          BACKUP DATABASE  95.76             2008-02-08 08:09:48  0.16         0.01     0.00       Backup database AdventureWorks to disk='c:adw.bak'

Encrypt a Column of Data

Sometimes we need to encrypt data for extra security. The following script shows how to encrypt the values in a column.

-- If there is no database master key, create one now.
IF NOT EXISTS
    (SELECT * FROM sys.symmetric_keys WHERE symmetric_key_id = 101)
    CREATE MASTER KEY ENCRYPTION BY
    PASSWORD = '23987hxJKL95QYV4369#ghf0%lekjg5k3fd117r$$#1946kcj$n44ncjhdlj'
GO

-- Create a certificate.
CREATE CERTIFICATE Sales09
    WITH SUBJECT = 'Customer Credit Card Numbers';
GO

-- Create a symmetric key protected by the certificate.
CREATE SYMMETRIC KEY CreditCards_Key11
    WITH ALGORITHM = AES_256
    ENCRYPTION BY CERTIFICATE Sales09;
GO

-- Create a column in which to store the encrypted data.
ALTER TABLE dbo.DimCustomer
    ADD EmailAddress_Encrypted varbinary(4000);
GO

-- Open the symmetric key with which to encrypt the data.
OPEN SYMMETRIC KEY CreditCards_Key11
    DECRYPTION BY CERTIFICATE Sales09;

-- Encrypt the value in column EmailAddress using the
-- symmetric key CreditCards_Key11, with an authenticator
-- derived from CustomerKey.
-- Save the result in column EmailAddress_Encrypted.
UPDATE dbo.DimCustomer
SET EmailAddress_Encrypted = EncryptByKey(Key_GUID('CreditCards_Key11'),
    EmailAddress, 1, HashBytes('SHA1', CONVERT(varbinary, CustomerKey)));
GO

-- Verify the encryption.
-- First, open the symmetric key with which to decrypt the data.
OPEN SYMMETRIC KEY CreditCards_Key11
    DECRYPTION BY CERTIFICATE Sales09;
GO

-- Now list the original email address, the encrypted value,
-- and the decrypted ciphertext. If the decryption worked,
-- the original value will match the decrypted value.
SELECT EmailAddress,
       EmailAddress_Encrypted AS 'Encrypted email address',
       CONVERT(nvarchar,
           DecryptByKey(EmailAddress_Encrypted, 1,
               HashBytes('SHA1', CONVERT(varbinary, CustomerKey))))
       AS 'Decrypted email address'
FROM dbo.DimCustomer;
GO
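
One small addition, using only the objects created above: the script opens the symmetric key but never closes it, so close it once the work is done.

-- Close the symmetric key when encryption and verification are finished.
CLOSE SYMMETRIC KEY CreditCards_Key11;
GO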

Encrypting Column Level Data in SQL Server

The ability to encrypt data natively using T-SQL was provided in SQL Server 2005 with the introduction of SQL Server cryptographic services. The magic behind this feature originates in the operating system with the Data Protection API (DPAPI). The first time an instance of SQL Server is started, the “service master key” (SMK) is created. The SMK is a 128-bit 3DES key which is encrypted using the DPAPI and the credentials of the SQL Server service account. Once created, the SMK is used to encrypt all “database master keys” (DMKs) and various server-side resources: credentials, linked server logins, etc. The ability to back up, restore, and regenerate the SMK is available through T-SQL:

BACKUP SERVICE MASTER KEY
TO FILE = 'C:\SMK\service_master_key'
ENCRYPTION BY PASSWORD = 'Pa$$w0rd';
GO
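
For completeness, and as a hedged sketch reusing the same placeholder path and password as the backup above, the matching restore and regenerate statements look like this:

-- Restore the SMK from the backup file taken above.
RESTORE SERVICE MASTER KEY
    FROM FILE = 'C:\SMK\service_master_key'
    DECRYPTION BY PASSWORD = 'Pa$$w0rd';
GO

-- Regenerate the SMK in place; everything it protects is re-encrypted.
ALTER SERVICE MASTER KEY REGENERATE;
GO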