
Friday, 21 March 2014

[SOLVED] NTFS Disk Mount Issues on Linux on a "Dual-Booted" Computer.

Good day, Fellas!

Background: I installed Windows Server 2012 on an NTFS partition I originally carved out of an existing Ubuntu Linux 13.10 disk (dual-booting means having one computer host two operating systems).
Upon accessing the NTFS disk from the Ubuntu Linux OS, I got an error message implying that the disk has an unclean file system or is in an unsafe state. Below is the error thrown when the host Ubuntu Linux OS attempts to mount the NTFS partition.

Fig.1  Error encountered while accessing the NTFS partition

Cause: This is caused by a recent tweak in Microsoft OSes (Windows 8, Windows Server 2012) called Fast Startup. During shutdown, Fast Startup saves device (RAM, CPU, disk) boot metadata (e.g. registry boot parameters) to files, to be read back at the next startup to shorten boot time. This leaves a "lock" on the NTFS partition, hence the difficulty opening the disk from Linux.

FROM WINDOWS: From your Windows OS environment, access the Power Options under Control Panel. Move down to the shutdown settings panel, where you can disable Fast Startup by unticking "Turn on fast startup".

FROM LINUX: We will use the software package called ntfs-3g, which ships a powerful binary called ntfsfix. Let's first install the package with super-user privileges using the command below.

# sudo apt-get install ntfs-3g

Let's next confirm that the installation was successful by checking the man pages (i.e. utility manual pages) for ntfsfix.

# man ntfsfix

Fig.2  NTFSFIX utility manual on Linux.

Now, let's proceed to run the utility to fix the issue captured above by simply :) running

# sudo ntfsfix /dev/sda3
where sda3 is the NTFS partition of the disk that you have difficulty opening.
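Once the partition mounts cleanly, you can also have it mounted automatically at boot via /etc/fstab. A sketch of such an entry, assuming the same /dev/sda3 partition and a mount point of /media/windows that you have already created:

# /etc/fstab entry (example): mount the NTFS partition read-write with the ntfs-3g driver
/dev/sda3    /media/windows    ntfs-3g    defaults    0    0

After saving the file, a quick "sudo mount -a" applies the entry without a reboot.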

Fig.3  NTFSFIX utility run  on Linux without errors.

Verify that the partition is accessible, as I have done below.

Fig.4  Partition is now fully accessible.

Issue solved, hopefully :)

Tuesday, 18 February 2014

[SOLVED] How to inter-connect Oracle Virtualbox Virtual Machine Guest Computers

Many times there is the need to set up a group of virtual servers with the goal of interconnecting each guest VM with its peer VMs on the same VirtualBox instance.

Background: I was in the process of installing 3 virtual servers on an Ubuntu Linux system, viz: a Hyperion Oracle Database (OLTP) server, an Essbase OLAP server and an Oracle Enterprise Performance Management (EPM) server as middleware.
There was the technical requirement to interconnect these Windows Server OS guests, as in an enterprise environment, using the "all-powerful" VirtualBox virtualisation tool.

Fig.1 Virtual Guests Servers on VirtualBox instance needing inter-connectivity.

1. Configure the network adapter of each guest VM to be attached to an "Internal Network", in the network section under "Adapter 1".

Fig.2 Network adapter settings

2. Set up Virtualbox's DHCP server and add a connectivity IP range for the use of guest virtual machines. This configuration simulates a virtual router environment.

 This is done by running the command below, which sets the pool of addresses assignable to guests (lower and upper bounds) and also gives the virtual "DHCP server" its own IP. The addresses shown are an example range; with it, the guests in this walkthrough end up with and

VBoxManage dhcpserver add --netname intnet \
    --ip --netmask \
    --lowerip --upperip --enable

Fig.3 command run to set up dhcpserver and accompanying settings

3. Turn the firewall of each guest VM's operating system off entirely.

Fig.4 Firewall settings

4. Switch on the VM guests.
    i) verify the IP assigned to each guest VM and
    ii) try interconnecting to each of the VMs from each other,
    for example: have  guest 1 ping guest 2 and vice versa.

Fig.5    A cross ping between guest Windows OS 1 ( and 2 ( succeeds, hence eureka!

Monday, 6 January 2014

Java Object-Relational Mapping: Using JBoss' Hibernate SchemaExport tool to generate DDL from Entity Classes and Hibernate Mapping Files

Happy New Year, World :)

1. Introduction 
I will take you through how to use Java classes (powered by JBoss' Hibernate) to build a database on the fly. In a subsequent post I will show how to use Netbeans to auto-generate entity and mapping files from a datasource.

There is every reason not to build your database manually:
    · many times you may have little information about your eventual deployment environment.
    · truly service-oriented applications are database agnostic ("database unaware").

 Note: Linux is the development environment in which I will execute the command-line scripts.

I will use the Apache Derby/JavaDB lightweight database for this tutorial. The database starts out empty, awaiting schema creation via the Hibernate tools.

This approach is fundamentally neither database- nor OS-dependent; it should work across all environments.

Fig. 1. The empty application database.

2. Process:  
To create a database straight from code using Hibernate ORM (simply put, an implementation of JPA), one needs 3 inputs, i.e.
    · the Hibernate configuration file, i.e. hibernate.cfg.xml
    · the Hibernate mapping files, e.g. Tag.hbm.xml
    · Java entity classes adorned with persistence annotations, e.g. @Column
See details below.
2.1. Hibernate Configuration file: This contains the database connection information and other hibernate specific parameters as sampled below:  

    Fig. 2. Content of Hibernate Configuration File.

      As you can observe, it also lists the various HBM files (with their full package location) that we need as input. Also remember that these 2 properties must be defined to enable, respectively, a persistence session and schema manipulation: hibernate.current_session_context_class and hibernate.hbm2ddl.auto.
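      For readers who cannot view the screenshot, a minimal hibernate.cfg.xml along these lines would do; the Derby connection values are assumptions for illustration, while the mapping package matches the one used later in this post.

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE hibernate-configuration PUBLIC
    "-//Hibernate/Hibernate Configuration DTD 3.0//EN"
    "">
<hibernate-configuration>
  <session-factory>
    <!-- Database connection settings (example values for Apache Derby/JavaDB) -->
    <property name="hibernate.connection.driver_class">org.apache.derby.jdbc.ClientDriver</property>
    <property name="hibernate.connection.url">jdbc:derby://localhost:1527/blogdb</property>
    <property name="hibernate.connection.username">app</property>
    <property name="hibernate.connection.password">app</property>
    <property name="hibernate.dialect">org.hibernate.dialect.DerbyDialect</property>

    <!-- The two properties discussed above -->
    <property name="hibernate.current_session_context_class">thread</property>
    <property name="hibernate.hbm2ddl.auto">create</property>

    <!-- The HBM mapping files, with full package location -->
    <mapping resource="org/softlogic/blog/model/Posting.hbm.xml"/>
    <mapping resource="org/softlogic/blog/model/Poster.hbm.xml"/>
    <mapping resource="org/softlogic/blog/model/SysUser.hbm.xml"/>
    <mapping resource="org/softlogic/blog/model/Tag.hbm.xml"/>
  </session-factory>
</hibernate-configuration>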

   2.2. The Hibernate mapping (HBM) files: These are needed to model the 4 database tables and are listed below: Posting.hbm.xml, Poster.hbm.xml, SysUser.hbm.xml and Tag.hbm.xml.

      Note: These were created manually, kindly see my related blog post on how to auto-create HBM files from Netbeans.

      Take a look at the content of the sampled Tag.hbm.xml file. It describes the signature of the database table to be created (with the complement of the Tag entity class in the next section).

Fig. 3. Content of a Hibernate mapping file modeled to mimic a Tag relational entity.
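A minimal Tag.hbm.xml would look along these lines; the field, column and table names are assumptions for illustration, not the exact ones from the screenshot.

<?xml version="1.0"?>
<!DOCTYPE hibernate-mapping PUBLIC
    "-//Hibernate/Hibernate Mapping DTD 3.0//EN"
    "">
<hibernate-mapping package="org.softlogic.blog.model">
  <class name="Tag" table="TAG">
    <!-- Primary key, generated by the database -->
    <id name="tagId" column="TAG_ID" type="long">
      <generator class="native"/>
    </id>
    <!-- A simple mapped column -->
    <property name="tagName" column="TAG_NAME" type="string" length="64"/>
  </class>
</hibernate-mapping>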

      Note: see other blog posts on how to auto-create the mapping file this time around using the Netbeans Hibernate wizard.

    2.3 Java Entity classes: A sampled entity, Tag.java, has been scripted below.

Fig. 4. Content of a Hibernate Entity modeled to mimic the desired Tag Table.
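A sketch of such an entity as a plain POJO; the field names here are assumptions chosen to mirror the mapping-file sketch, not the exact ones from the screenshot.

```java
// A plain POJO entity that Hibernate maps via Tag.hbm.xml.
public class Tag {
    private Long tagId;      // primary key, generated by the database
    private String tagName;  // e.g. "java", "hibernate"

    public Tag() { }         // Hibernate requires a no-argument constructor

    public Long getTagId() { return tagId; }
    public void setTagId(Long tagId) { this.tagId = tagId; }

    public String getTagName() { return tagName; }
    public void setTagName(String tagName) { this.tagName = tagName; }
}
```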
     Now that we are done with the setup of the needed input files, we move on to the execution phase.

    3. Execution:
    With the various inputs done, we will use Hibernate's HBM2DDL tool to create the database, using one of the options below to run SchemaExport:
1. Invoking SchemaExport from a Java main class
2. Running SchemaExport from the command line
3. Running SchemaExport from Ant build scripts

     3.1. Invoking HBM2DDL SchemaExport  from the Java main class

      Create a class, e.g. SchemaCreator.java, to serve as the point of invocation of the schema creation or SQL DDL generation. This class makes use of the persistence session created by the HibernateUtil class. HibernateUtil is a specialised class used by the Hibernate ORM framework; it can carry configuration beyond the settings in hibernate.cfg.xml, from which the framework creates a persistence session. The hibernate.cfg.xml file is located at the root of your Java package structure, as shown below.


    Fig. 5. Location of the hibernate.cfg.xml file in the package structure.

   The HibernateUtil class is detailed below.

     Fig. 6. Content of a HibernateUtil Java file.

      The SessionFactory creation from a hibernate.cfg.xml file is performed by the statement below, as shown by the screenshot above.

      SessionFactory  sessionFactory = new Configuration().configure().buildSessionFactory();

      Note: To have the session created from a hibernate.cfg.xml located elsewhere, use the code below.
     sessionFactory = new Configuration().configure("<CUSTOM_DIRECTORY>/hibernate.cfg.xml").buildSessionFactory();

     You can also add extra configuration parameters by using the instance methods addResource(~)  and addProperties(~) on the Configuration class.



Fig. 7. instance methods of the Configuration class.
        3.1.1 Invoking the schema creation from the SchemaCreator main class
     Fig. 8. Content of SchemaCreator invocation class

    Note: The above class principally uses two static objects - SessionFactory and Session.
    sessionFactory = HibernateUtil.getSessionFactory(); // fetches session details from the configs in the HibernateUtil class
    session = sessionFactory.getCurrentSession(); // loads the application session from Hibernate, used for searches, updates and saves of entity objects in an application.
     Proceed to compile the above class and execute the class file to get the output below.

Fig. 9. Output of execution of SchemaCreator

    Perform a refresh of the data schema to see the new tables created in your database (remember, the connection parameters and more were stated in the hibernate.cfg.xml file).

Fig. 10. Database loaded with tables modeled from mapping files and entities

3.2.   Running  SchemaExport  from the command line
The syntax is stated below

3.2.1 This is a Java class invocation from the command line:
      java -cp 'hibernate_classpath' org.hibernate.tool.hbm2ddl.SchemaExport options my_hbm_mapping_files
     hibernate_classpath = the runtime resources needed by Hibernate, i.e. dependent classes and jar files
     my_hbm_mapping_files = the location of all the *.hbm.xml files
     options = the command-line parameters of the SchemaExport tool, i.e.:

         --quiet               do not output the script to stdout
         --drop                only drop the tables
         --create              only create the tables
         --text                do not export to the database
         --output=<file>       output the DDL script to a file
         --config=<file>       read Hibernate configuration from an XML file
         --properties=<file>   read database properties from a file (alternative to --config)
         --format              format the generated SQL nicely in the script
         --delimiter=<x>       set an end-of-line delimiter for the script

      3.2.2 Execution: 
     java -cp "$HBN_HOME/*"  org.hibernate.tool.hbm2ddl.SchemaExport  
     --create  $PROJECT_HOME/src/java/org/softlogic/blog/model/*.hbm.xml

Fig.11. Command line execution of the Hibernate tool’s SchemaExport as seen on Linux.

     3.3.   Running the SchemaExport from Apache ANT build scripts
     The Ant script (Ant being a powerful Java build-administration tool) is essentially an XML file that contains the activities to be carried out on a Java project at build and deployment time. It is also sometimes used to "set up" the various dependencies a project has before its "first run".

     In our case, on first-run  we want our database created.
     Just embed the code below, which you can customise, in the build.xml file (or Netbeans' build-impl.xml, whose contents are loaded into build.xml at compile time). The build script is located at the root of your project folder, as captured below.

Fig.12. compilation build file as seen in the project folder

     Modify the project build script by embedding another build target node for schemaexport, as shown below

<target name="schemaexport">
    <taskdef name="schemaexport"
             classname="org.hibernate.tool.hbm2ddl.SchemaExportTask"/>
    <schemaexport config="hibernate.cfg.xml">
        <fileset dir="src">
            <include name="**/*.hbm.xml"/>
        </fileset>
    </schemaexport>
</target>

Content of the build.xml file.

Fig.13. config snippet from ant build file

Perform a compilation of the project to have the schemaexport performed. Refresh the database to confirm the new table additions.


4.0 Legend
    JPA=Java Persistence Application Programmer Interface
    ORM=Object Relational Mapping
    DDL=Data Definition Language 

   5.0 Don't get yourself bothered working too hard in life; work smart :)

Thursday, 31 October 2013

JSF Primefaces Fileupload runtime dependency error

There are times when you create a Primefaces Java project on NetBeans and, upon first build, you are greeted with an error whose stack trace indicates a missing runtime dependency. The error, as captured below, relates to the Primefaces FileUpload widget component not having access to a required class on the classpath of your project.
The error is summarised as: Cause: Class 'org.primefaces.component.fileupload.FileUploadRenderer' is missing a runtime dependency.


The Primefaces widget has been designed by PrimeTek to depend on the Apache Commons FileUpload classes, which can be obtained from the Apache Commons download page as an archive.

Extract and drop the commons-fileupload-1.3.jar file into your project library folder, or along your project classpath, and recompile the project.
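If your project is Maven-based, the equivalent would be to declare the dependency instead of dropping the jar in manually (the version shown matches the jar named above):

<dependency>
    <groupId>commons-fileupload</groupId>
    <artifactId>commons-fileupload</artifactId>
    <version>1.3</version>
</dependency>

Maven will also pull in commons-io, which commons-fileupload itself depends on, transitively.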

In the NetBeans IDE, just add the jar file to your project's libraries as shown below.

Now rebuild your project and EUREKA.

Friday, 2 August 2013

Movement of an SQL Server Temp Database

The tempdb, being a system database, is used by SQL Server to store internal objects such as the intermediate results of a query. Its data pages are moved to and from disk as they are accessed by SQL Server, so it should be placed on a drive that yields good I/O speed.
As with any system database, any activity carried out on it must follow a Microsoft-recommended
technical line of action - detailed below.

Reason: Usually, the biggest trigger for moving the tempdb is limited disk space, which in the short term can be mitigated by restarting the database instance to reclaim temp space.

The tempdb grows with large result sets queried from the database. It also grows when sorting is carried out on a user database and when there are open transactions. Running the health-check operation DBCC CHECKDB can also balloon the tempdb when it runs for too long.
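Before deciding on a move, it helps to see how big the tempdb files have grown. A quick check (file sizes in sys.database_files are recorded in 8 KB pages, hence the conversion):

```sql
-- Current size of each tempdb file, converted from 8 KB pages to MB
SELECT name, size * 8 / 1024 AS size_mb, physical_name
FROM tempdb.sys.database_files;
```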

Pre-requisites:
1. The new location should be accessible, i.e. writable.
2. The Windows/SQL Server profile should have the privilege to update database file attributes.
3. The database should be running in full, good health.

1.   Determine the logical file names (data and log) of the tempdb database and their current location on disk. Find below the SQL script that returns the logical names of the data and log files.

SELECT name, physical_name AS Current_Location FROM sys.master_files WHERE database_id = DB_ID(N'tempdb');

2.  Modify the location of each file (data and log files) by using ALTER DATABASE command on the master database.

USE master;
GO
ALTER DATABASE tempdb MODIFY FILE (NAME = tempdev, FILENAME = 'E:\users\annang\db_bag\tempdb.mdf');
ALTER DATABASE tempdb MODIFY FILE (NAME = templog, FILENAME = 'E:\users\annang\db_bag\templog.ldf');


3. Now, perform the activities below:
    a.      Move the physical data and log files to the new location.

    b.      Stop and start the SQL Server instance.

    c.      Verify the availability of SQL Server.

Friday, 19 July 2013

Oracle XML: When an Oracle Database houses an XML guest

...Oracle hooks-up with XML

XML documents, apart from being a datasource, also serve as structured documents with a definable schema. XML, as a standard, contains textual data which can be stored in an Oracle database. XML data is usually described as a Character Large Object (CLOB), since a single document or data structure can host many bytes of string data.
The fact that XML is both specialised data and a datasource means its storage in an Oracle database demands a specific approach to data storage and retrieval.
An Oracle database stores XML data in either a table or a column with a specific datatype - XMLTYPE. Such data, once stored, is accessed using PL/SQL functions (which extend custom XMLTYPE functions and operations) and XPath constructs.
XPATH, the XML Path Language, is a query language for selecting nodes from XML data.
Below are sampled steps used to store and retrieve XML data in an Oracle database.

1. We create a table to store all ORDER XML content within a column of a database table.

CREATE TABLE ORDER_XML_DOC_TBL (
    FILEID varchar2 (12),
    FILENAME varchar2 (64),
    XML_DOCUMENT XMLTYPE
);

2. Creation of a CLOB variable to temporarily host the XML data, e.g. a sample Order document:

<?xml version="1.0"?>
<Order>
    <CustomerName>Joyce Appiah Ofori</CustomerName>
    <Item>
        <ItemId>579</ItemId>
        <ItemName>Sephora Beauty Range</ItemName>
        <Quantity unit="12">3</Quantity>
    </Item>
    <Item>
        <ItemName>Mary Kay Cosmetics</ItemName>
    </Item>
    <Item>
        <ItemName>L’eggs Stockings</ItemName>
    </Item>
</Order>

3. Storage of the XML in the database table within a transaction, to guarantee a full export or none.
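A sketch of such a transactional insert, assuming the ORDER_XML_DOC_TBL table from step 1; the FILEID and FILENAME values are made up for illustration.

```sql
DECLARE
    -- CLOB variable temporarily hosting the XML content (step 2)
    v_order_doc CLOB := '<?xml version="1.0"?><Order><CustomerName>Joyce Appiah Ofori</CustomerName></Order>';
BEGIN
    INSERT INTO ORDER_XML_DOC_TBL (FILEID, FILENAME, XML_DOCUMENT)
    VALUES ('F0001', 'order_0001.xml', XMLTYPE(v_order_doc));
    COMMIT;  -- all-or-nothing: a failure before this point leaves the table untouched
EXCEPTION
    WHEN OTHERS THEN
        ROLLBACK;
        RAISE;
END;
/
```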

 4. Fetching data from the XML table using the XMLTYPE functions of PL/SQL.
Example 1. Using the SQL functions 'extract' and 'extractValue' to display all ItemId child elements of the Order parent as a varchar2 string type.
SELECT extract(XML_DOCUMENT, '/Order/Item/ItemId/text()').getStringVal() "ITEM_IDs" FROM ORDER_XML_DOC_TBL
or simply
SELECT extract(XML_DOCUMENT, '/Order//ItemId/text()').getStringVal() "ITEM_IDs" FROM ORDER_XML_DOC_TBL
or using the extractValue function (which needs no data conversion) with an ORDER BY clause:
SELECT extractValue(XML_DOCUMENT, '/Order/Item/ItemId') "ITEM_IDs" FROM ORDER_XML_DOC_TBL ORDER BY extractValue(XML_DOCUMENT, '/Order/Item/Quantity')

Example 2. Using the SQL function 'extractValue' to display the particular Item with ItemId = 579. Also introduced is the existsNode function, which returns 1 or 0 for true or false respectively.

SELECT extractValue(XML_DOCUMENT, '/Order/Item/ItemId') "ITEM_ID" FROM ORDER_XML_DOC_TBL WHERE existsNode(XML_DOCUMENT, '/Order//Item[ItemId="579"]') = 1;

5. Reference
 Sample Order XML from Sybase Infocenter site.
 Oracle XML syntax from the Oracle Documentation site.