Wednesday, 23 September 2015

Create ISO Boot Image Using ISOLinux and mkisofs Utility on Linux Box

Dear Viewers

This post will help you to create an ISO boot image using ISOLinux and the mkisofs utility on a Linux box.

Prerequisite:
1. Syslinux archive.
2. Binary files, source code files and other required configuration files of your preferred Linux flavor.
3. mkisofs utility installed.
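
Once the prerequisites are in place, the usual procedure is to copy isolinux.bin and isolinux.cfg from the Syslinux archive, together with the kernel and initrd of your chosen distribution, into a staging directory and then run mkisofs on it. A minimal sketch is given below; the staging directory iso_root/, the volume label and the output file name are my assumptions, not fixed values.

$ mkisofs -o custom.iso \
    -b isolinux/isolinux.bin -c isolinux/boot.cat \
    -no-emul-boot -boot-load-size 4 -boot-info-table \
    -J -R -V "CUSTOM_LINUX" iso_root/

Here -b and -c point to the ISOLINUX boot loader and the boot catalog (which mkisofs creates), -no-emul-boot, -boot-load-size 4 and -boot-info-table are the options ISOLINUX requires, and -J and -R add Joliet and Rock Ridge extensions.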


Tuesday, 22 September 2015

Make and CMake Utility Example in Linux


Dear Viewers

This post will help you to use the Make and CMake utilities in Linux for automatically compiling and building a C language project.

Prerequisite:
1. Basic knowledge of C language. 
2. GCC compiler on your Linux box. 
3. Make and CMake utilities installed. (On a Linux terminal, type man make and man cmake to know more details.)


Concept:
In your daily routine you might have come across a situation where a big software application needs to be split up into many small modules. Each of these modules may contain thousands of lines of source code, and the modules may be developed by different developers sitting at different locations. In such a scenario, whenever a developer changes the source code, he is expected to compile that code manually, and the same applies to the other developers. Recompiling everything by hand each time is a tedious job. Is there a solution to this problem? Yes: the Make and CMake utilities. Both are explained below with a practical demonstration.
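
As a quick illustration of how these utilities save manual recompilation, here is a minimal sketch of the usual CMake out-of-source build, assuming a hypothetical project directory that already contains a CMakeLists.txt and its C sources:

$ mkdir build && cd build
$ cmake ..      # reads ../CMakeLists.txt and generates Makefiles
$ make          # recompiles only the sources that changed since the last build
$ make clean    # removes the generated object files and executables

With plain Make alone, you write a Makefile by hand in the project directory and simply run make there; make compares file timestamps and rebuilds only the targets whose dependencies have changed.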
 

Monday, 21 September 2015

Hadoop MapReduce WordCount Program in Java

Dear Viewers,
 
This post will help you to run Hadoop MapReduce Word Count program in Java. 

Prerequisite:
1. Running Hadoop environment on your Linux box. (I have Hadoop 2.6.0)
2. Java installed on your Linux box. (I have Java 1.7.0)
3. External jar - hadoop-core-1.2.1.jar
4. Text input file  (I have Inputfile.txt)

Flow:
1. Prepare 3 Java source code files, namely WordCount.java, WordMapper.java and WordReducer.java.
2. WordCount.java contains the main class. You may also refer to it as the Driver class. From its source 
code, it refers to WordMapper.class and WordReducer.class.
3. WordMapper.java splits up the user input and, as output, generates <key,value> pairs. 
That is, <word, and its count>.
4. WordReducer.java accepts the output of the Mapper as its input. It combines the output provided 
by WordMapper.class and generates the final output, which is also a <key,value> pair. The final output 
indicates how many times each word has occurred. 
5. Compile the source code files, making use of the external jar file. 
6. After successful compilation, create a jar file by putting together all the .class files (see the sample commands after this list).
7. Run your program. The syntax to be followed while running this program is as below. 
$ hadoop jar jar-file-name Driver/main-class-name Input-file-name-on-hdfs 
Output-file-directory-on-hdfs
8. The output file on HDFS generated by the Mapper class will be named “part-m-00000” and the one generated by the Reducer 
class will be named “part-r-00000”. So open the file part-r-00000 from the terminal to see the final output. 
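
A concrete run might look like the sketch below. The jar name wordcount.jar, the classes directory and the HDFS paths /wordcount/input and /wordcount/output are my assumptions for illustration; adjust them to your setup.

$ mkdir classes
$ javac -classpath hadoop-core-1.2.1.jar -d classes WordCount.java WordMapper.java WordReducer.java
$ jar -cvf wordcount.jar -C classes/ .
$ hadoop fs -mkdir -p /wordcount/input
$ hadoop fs -put Inputfile.txt /wordcount/input
$ hadoop jar wordcount.jar WordCount /wordcount/input /wordcount/output
$ hadoop fs -cat /wordcount/output/part-r-00000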

Sunday, 20 September 2015

Hadoop MapReduce WordCount Program in Python

Dear Viewers,
This post will help you to run Hadoop MapReduce Word Count program in Python.



Prerequisite:
1. Running Hadoop environment on your Linux box. (I have Hadoop 2.6.0)
2. Python installed on your Linux box. (I have Python 2.7.3)
3. External jar - hadoop-streaming-2.6.0.jar
4. Text input file (I have Employee.txt)



Flow:
  1. Prepare 2 Python source code files, namely Mapper.py and Reducer.py.
  2. Mapper.py splits up the user input and, as output, generates <key,value> pairs. That is, <word, and its count>.
  3. Reducer.py accepts the output of the Mapper as its input. It combines the output provided by Mapper.py and generates the final output, which is also a <key,value> pair. The final output indicates how many times each word has occurred.
  4. To run these files from the Linux terminal, make them executable. To do so, you may use the command $ chmod +x *.py
  5. The syntax to be followed while running this program is as below; '\' on the terminal indicates line continuation. A concrete example follows this list.
    $ hadoop jar hadoop-streaming-2.6.0.jar \
    -file filename -mapper mapperfile \
    -file filename -reducer reducerfile \
    -input inputfilename \
    -output outputfiledirectory
  6. The output file on HDFS will be named “part-00000”. So open this file from the terminal to see the final output.
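
For example, with the files named in the prerequisites, an actual invocation might look like this (the HDFS paths /streaming/input and /streaming/output are my assumptions):

    $ hadoop fs -mkdir -p /streaming/input
    $ hadoop fs -put Employee.txt /streaming/input
    $ hadoop jar hadoop-streaming-2.6.0.jar \
    -file Mapper.py -mapper Mapper.py \
    -file Reducer.py -reducer Reducer.py \
    -input /streaming/input/Employee.txt \
    -output /streaming/output
    $ hadoop fs -cat /streaming/output/part-00000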

Monday, 10 August 2015

Sqoop import and export between Mysql and HDFS

Dear Viewers,

The following script will help you to perform import and export between MySQL and HDFS using Sqoop.

# Script Starts here SqoopJobs.sh ########

#!/bin/bash
# Title: Shell script to perform import and export between MySQL and HDFS (and vice versa) using Sqoop
# Author: Pavan Jaiswal
# Note: This program can be written as a shell script or in Java. I have chosen a shell script here.
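
# A minimal sketch of the kind of Sqoop commands such a script runs. The database name
# testdb, the table names and the HDFS paths below are assumptions for illustration only;
# -P makes Sqoop prompt for the MySQL password.

# Import a MySQL table into HDFS
sqoop import --connect jdbc:mysql://localhost/testdb \
  --username root -P \
  --table employee --target-dir /sqoop/employee -m 1

# Export the HDFS directory back into a (pre-created) MySQL table
sqoop export --connect jdbc:mysql://localhost/testdb \
  --username root -P \
  --table employee_copy --export-dir /sqoop/employee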

Friday, 7 August 2015

Data analysis using Hive script

Dear Viewers,

This post will help you to run a Hive script on a Hive table. The Hive script is written for the problem statement given below.

A Hive script can be saved with the extension '.hql' or '.sql'. Copy and paste the code below into a file named "HiveScript.hql". In a Hive script, a single-line comment starts with '--'.
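
Once the script is saved, it can be executed non-interactively from the Linux terminal. A minimal sketch, assuming Hive's bin directory is on your PATH:

$ hive -f HiveScript.hql

The -f option makes the Hive CLI run the statements in the given file and exit; query results are printed on the terminal.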


OSD question bank first three units

Unit 1: Introduction to UNIX OS

1. Draw and explain UNIX architecture.
2. Differentiate monolithic, micro and exo kernel.
3. Explain booting process in detail.
4. Compare and contrast GRUB-1 and GRUB-2.
5. Explain with neat diagram buffer header.
6. Write and explain algorithm getblk().
7. Explain algorithm bread() and bwrite().
8. Explain algorithm iget() and iput().
9. Draw and explain UNIX file system architecture.
10. What do you understand by free space management? Explain its different approaches.
11. What is the importance of swapping in OS?


Thursday, 6 August 2015

Distributed Programming - questions and solution set (unit 1)

Sincere Thanks to Ms. Shrida Kalamkar for preparing the solution set.

Distributed Programming - presentation

Hive installation and configuration

Dear Viewers,

This post will help you to set up and configure Hive on Hadoop on Fedora 17.


Introduction:
Apache Hive is a data warehouse system built on top of Hadoop. Many people refer to Hive as part of the Hadoop ecosystem. Hive is useful for performing data summarization and data analysis. Hive supports an SQL-like language called Hive Query Language (HQL).
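
The installation itself is mostly a matter of unpacking the release archive and exporting a couple of environment variables. A minimal sketch, assuming Hadoop is already running, the hadoop command is on your PATH, and /usr/local/hive-0.11.0 is the install location (the path used elsewhere on this blog); the archive name may differ slightly depending on the release you download:

$ sudo tar -xzf hive-0.11.0.tar.gz -C /usr/local/
$ export HIVE_HOME=/usr/local/hive-0.11.0
$ export PATH=$PATH:$HIVE_HOME/bin
$ hive

If everything is in place, the last command drops you at the hive> prompt. For a permanent setup, put the two export lines into your ~/.bashrc.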


Monday, 3 August 2015

Basic Hive Queries

Dear Viewers,

This post will help you to execute basic queries on Hive.

Operations performed are:
1. Create database
2. Create tables
3. Alter table
4. Drop table
5. Drop database
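
A minimal sketch of what these operations look like at the Hive prompt; the database and table names used here (collegedb, student) are only examples:

hive> CREATE DATABASE collegedb;
hive> USE collegedb;
hive> CREATE TABLE student (rno INT, name STRING) ROW FORMAT DELIMITED FIELDS TERMINATED BY ',';
hive> ALTER TABLE student RENAME TO student_info;
hive> DROP TABLE student_info;
hive> DROP DATABASE collegedb;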


Sunday, 2 August 2015

First Hive script on Hadoop cluster

Dear Viewers,

This post will help you to write and execute your first Hive script from the Linux terminal. Before we go ahead, you have to start Hadoop. You can start it by running the start-all.sh script available in the Hadoop directory. To run a Hive script, you must have the Hive directory available on your system.

In my case I have hadoop available at /usr/local/hadoop-2.6.0 and Hive at /usr/local/hive-0.11.0
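
With those paths, a typical session from the terminal looks like the sketch below; FirstScript.hql stands for whatever file name you give your script:

$ /usr/local/hadoop-2.6.0/sbin/start-all.sh
$ /usr/local/hive-0.11.0/bin/hive -f FirstScript.hql

In Hadoop 2.x, start-all.sh simply calls start-dfs.sh and start-yarn.sh to bring up HDFS and YARN, and hive -f executes the statements in the given script file and prints the results on the terminal.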


Friday, 31 July 2015

Hadoop 2.6.0 Single Node Setup on Fedora

Dear viewers,

This post will help you with a single-node setup of Hadoop 2.6.0 on Fedora or similar systems.

Steps:

1. Install Java if it is not already present (version 1.6+).
Use sudo yum install java-package-name
After installation, cross-check it with the commands below.

[pavan@Pavan ~]$ java -version
java version "1.7.0_b147-icedtea"
OpenJDK Runtime Environment (fedora-2.1.fc17.6-x86_64)
OpenJDK 64-Bit Server VM (build 22.0-b10, mixed mode)

[pavan@Pavan ~]$ which java
/usr/bin/java

[pavan@Pavan ~]$ whereis java
java: /bin/java /usr/bin/java /etc/java /lib64/java /usr/lib64/java /usr/share/java /usr/share/man/man1/java.1.gz


Run JSP and Servlet in Apache Tomcat Without Eclipse

Dear viewers,

This post will help you to run JSP and Servlet in Apache Tomcat without using any IDE (e.g. Eclipse or NetBeans).

Prerequisite:
1. Apache Tomcat
  - I have apache-tomcat-7.0.63 on my Fedora machine.
2. Java
  - I have java-1.7.0-openjdk-1.7.0.3.x86_64 installed on my machine
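
With just these two in place, a JSP can be deployed by dropping it into a directory under webapps and starting Tomcat. A minimal sketch, assuming Tomcat is unpacked in the current directory, JAVA_HOME points at your JDK, and using a hypothetical application name "hello":

$ mkdir -p apache-tomcat-7.0.63/webapps/hello
$ echo '<html><body><h2>Hello, <%= new java.util.Date() %></h2></body></html>' > apache-tomcat-7.0.63/webapps/hello/index.jsp
$ apache-tomcat-7.0.63/bin/startup.sh

Then open http://localhost:8080/hello/ in a browser; Tomcat compiles the JSP on the fly. For a servlet, you would additionally compile the class against lib/servlet-api.jar, place it under webapps/hello/WEB-INF/classes, and declare it either in a web.xml or with the @WebServlet annotation supported by Tomcat 7.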


Wednesday, 29 July 2015

Java program to perform 64-bit number multiplication using the shared memory concept

/*
    Title: Java program to perform 64-bit number multiplication using memory-mapped files and utility classes
    -- Memory-mapped files have been used to simulate the shared memory concept in Java. Multiple threads can be thought of as another alternative here.
    -- A 19-digit input is accepted, which in turn represents 64 bits.
    -- The schoolbook multiplication method is applied.
    -- For simplicity, a block of 8 bytes is used for the multiplication.
*/


Wednesday, 15 July 2015

Mongodb commands to play with databases and collections

Dear Viewers,

This post helps you to create and drop databases, and to create and drop collections, in MongoDB.

I assume that you have MongoDB and the required JDK version present on your Linux machine. For demonstration purposes, I have used Fedora 17.
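
A minimal sketch of these operations in the mongo shell; the database and collection names (testdb, students) are only examples:

$ mongo
> use testdb
switched to db testdb
> db.createCollection("students")
{ "ok" : 1 }
> db.students.drop()
true
> db.dropDatabase()
{ "dropped" : "testdb", "ok" : 1 }
> exit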


Wednesday, 24 June 2015

MySql stored procedure and stored function

Dear Viewers,

This post demonstrates a MySQL stored procedure and a stored function. The small dataset considered is shown below.

mysql> select * from student;
+------+---------+-------+----------+
| rno  | name    | class | division |
+------+---------+-------+----------+
|    1 | John    | TE    | A        |
|    2 | Peter   | TE    | A        |
|    1 | David   | TE    | B        |
|    2 | Solomon | TE    | B        |
+------+---------+-------+----------+
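
Against this table, a stored procedure and a stored function might look like the minimal sketch below; the routine names list_division and count_division are my own examples, not part of the original post:

mysql> DELIMITER //
mysql> CREATE PROCEDURE list_division(IN div CHAR(1))
    -> BEGIN
    ->   SELECT rno, name FROM student WHERE division = div;
    -> END //
mysql> CREATE FUNCTION count_division(div CHAR(1)) RETURNS INT
    -> READS SQL DATA
    -> BEGIN
    ->   RETURN (SELECT COUNT(*) FROM student WHERE division = div);
    -> END //
mysql> DELIMITER ;
mysql> CALL list_division('A');
mysql> SELECT count_division('B');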

Monday, 22 June 2015

Shell script to perform simple file operations

Dear Viewers
The shell script below handles basic file operations. The Linux commands used to perform the operations are:
1. echo - to print on standard output
2. read - to read user input
3. touch - to create a file
4. mv - to rename a file
5. rm - to remove a file
6. ls - to list the contents of current directory  
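
A minimal sketch of such a script, covering creation, listing, renaming and deletion of a file (the file names are read from the user at run time):

#!/bin/bash
echo "Enter a file name to create:"
read fname
touch "$fname"            # create an empty file
ls                        # list the contents of the current directory
echo "Enter a new name for the file:"
read newname
mv "$fname" "$newname"    # rename the file
rm -i "$newname"          # remove the file, asking for confirmation first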


Thursday, 18 June 2015

Basic MySql commands for beginners

Dear Viewers,

This post presents basic MySQL commands useful for MySQL beginners. For demonstration purposes, I have used MySQL installed on Ubuntu 12.04.

1. Check mysql service status
pavan@ubuntu:~$ service mysql status
mysql start/running, process 1583

2. Stop mysql service
pavan@ubuntu:~$ sudo service mysql stop
mysql stop/waiting
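
Two commands that usually come next are starting the service again and connecting to the server; a minimal sketch, assuming the root account configured during installation:

pavan@ubuntu:~$ sudo service mysql start
pavan@ubuntu:~$ mysql -u root -p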


Tuesday, 9 June 2015

"Backspacce" key not working in Fedora nautilus

Dear Viewers,

While using Fedora distributions, you might have experienced that the "Backspace" key does not work in Nautilus. In simple words, Nautilus is the GNOME file manager used to browse the filesystem hierarchy under "/". So every time you want to go back to the previously accessed folder, you have to use the mouse and click the "Back" button available at the top left corner. If you want to get rid of this and wish to use the shortcut key instead, then follow the process given below:

1. Open terminal
2. Type in  vi /root/.config/nautilus/accels 
3. Replace the user root with the user name for whom you would like to apply these settings. Importantly, you have to provide the absolute path.
4. Go into "Insert" mode in vi, and copy and paste the line below as it is
 (gtk_accel_path "<Actions>/ShellActions/Up" "BackSpace") 
5.Press "Esc" then ":" and type "wq"
6. This will save the changes made and will take you back on terminal
7. On terminal type either "nautilus -q" or "killall nautilus"
8. Now open any folder, jump to any directories and cross check whether "Backspace" started working. 
9. You are done.

All The Best.....

Memory Management and Virtual Memory - presentation

Java Programming Concepts - presentation

Monday, 8 June 2015

Linux administration tools and commands

Dear Viewers,

You may use the following links, where Linux administration tools and utilities are explained well with examples.

Link 1:
http://www.tecmint.com/60-commands-of-linux-a-guide-from-newbies-to-system-administrator/

Link 2:
http://www.tldp.org/LDP/abs/html/system.html

Link 3:
http://www.reallylinux.com/docs/admin.shtml

Wednesday, 3 June 2015

Operating Systems Design - syllabus

Class: TE Computer                                                                                  Subject Code: 310242 

Course Objectives
· To learn the Operating System Booting Process
· To learn advanced file system and operating system management
· To learn the init() process and other essential boot processes
· To learn use of GRUB2

Course Outcomes
· Ability to use EFI based x64 Operating Systems
· Ability to use x64 based File Systems and Managers


Friday, 30 January 2015

My First Textbook on "Operating Systems" for Third Year Computer Engineering and IT Students of North Maharashtra University, Jalgaon.


Thursday, 29 January 2015

Embedded Operating Systems study material

Dear Viewers,

Please visit the URL below to see and use the study material for the subject "Embedded Operating Systems", TE Computer, SPPU. Feel free to post your queries on the blog below.

http://eoscompviit.blogspot.in/