
Thursday, 6 November 2014

[SOLVED] Mount and make a Linux Root Partition/Filesystem writable with EASE in seconds.

Pheeeeew! What a relief I have now :)


Background: This evening I was installing the new Android Studio IDE, which required setting some OS-level Java environment variables, i.e. JAVA_HOME. I did this in the .bashrc file in my user home directory and later realised that I had messed up that very important file, which is needed for login initialisation on my beloved Ubuntu desktop PC.
I consequently couldn't log into my computer, so I decided to share how I fixed it in simple steps!

Recommendation: The solution below gives you write access to the file system, which also helps in fixing issues related to fstab, boot partition mishaps, GUI crashes, etc.


1. From my Linux box I booted into recovery mode (the admin console), which was successful.

2. I had the challenge of writing to or editing my "embattled" .bashrc file because the whole file system was mounted read-only - even with administrator privileges.

3. I had to make my root file system writable by remounting it with the appropriate flags as snapped below:




mount -o remount,rw /
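After the remount, you can confirm that / really is read-write by inspecting /proc/mounts - a quick check, assuming a standard Linux /proc layout:

```shell
# print the first mount option (rw or ro) recorded for the root filesystem
rootmode=$(awk '$2 == "/" { split($4, o, ","); print o[1]; exit }' /proc/mounts)
echo "root filesystem is mounted: $rootmode"
```

If this prints "ro", the remount did not take effect and you should re-run the mount command.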

4. With this done, I proceeded to undo the changes I had made to my system
    earlier, i.e. to delete the line I added to my .bashrc file which triggered the mess.
    I edited the .bashrc file as a privileged user using pico, a simple text
    editor on Linux. You can use any other text editor available on the system,
    e.g. vi, vim, nano, etc.
 





sudo pico /home/...user.../.bashrc

or just

pico /home/...user.../.bashrc
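Before editing a critical init file like .bashrc, it is wise to keep a copy you can restore from the same recovery console - a minimal sketch (the .bak naming is just a convention I am assuming, not part of the original fix):

```shell
# keep a restorable copy of .bashrc before editing it
rc="$HOME/.bashrc"
[ -f "$rc" ] || touch "$rc"   # ensure the file exists for this sketch
cp "$rc" "$rc.bak"            # the safety copy
# ...edit "$rc" with pico/nano/vi...
# to undo a bad edit later:  cp "$rc.bak" "$rc"
echo "backup written to $rc.bak"
```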



5. I now proceeded to fix the wrong entries I had made in the .bashrc file.

6. I saved my changes and got back to the graphical login console by starting the X Window Linux GUI.




startx

7. That's all, best of luck :)

Friday, 21 March 2014

[SOLVED] NTFS Disk Mount Issues on Linux on a "Dual-Booted" Computer.

Good day, Fellas!

Background: I installed Windows Server 2012 on an NTFS partition I had originally allocated from an existing Ubuntu Linux 13.10 disk (dual-booting means having one computer host two operating systems).
Upon accessing the NTFS disk from the Ubuntu Linux OS, I got an error message implying that the disk has an unclean file system or is in an unsafe state. Below is the error thrown when the host Ubuntu Linux OS attempts to mount the NTFS partition.

Fig.1  Error encountered while accessing the NTFS partition

Cause: This is caused by a recent tweak in the Microsoft OSes (Windows 8, Windows Server 2012) simply called Fast Startup. During shutdown it saves device (RAM, CPU, disk) boot metadata (e.g. registry boot parameters) to files, to be read at the next startup to shorten boot time. This leaves a "lock" on the NTFS partition, hence the difficulty in opening the disk.

Solution: 
FROM WINDOWS: From your Windows environment, access Power Options under Control Panel. Move down to the shutdown settings panel, where you can disable Fast Startup by unticking "Turn on fast startup".

FROM LINUX: We will use the software package called ntfs-3g, which ships a powerful binary called ntfsfix. Let's first install the package with super-user privilege using the command below.

# sudo apt-get install ntfs-3g

Let's next confirm that the installation was successful by checking the man page (i.e. the utility's manual page) for ntfsfix:

# man ntfsfix
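Besides the man page, you can verify that the ntfsfix binary actually landed on your PATH - a small check using the POSIX `command -v` built-in:

```shell
# report whether the ntfsfix binary from ntfs-3g is available on PATH
if command -v ntfsfix >/dev/null 2>&1; then
    status=installed
else
    status=missing
fi
echo "ntfsfix is $status"
```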

Fig.2  NTFSFIX utility manual on Linux.

Now, let's proceed to run the utility to fix the issue snapped above by simply :) running

# sudo ntfsfix /dev/sda3
where sda3 is the NTFS partition of the disk that you have difficulty opening.
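If you are unsure which /dev/sdXN entry is the NTFS partition, a device listing helps you pick it out by size or filesystem type. This sketch prefers lsblk (from util-linux) and falls back to the kernel's /proc/partitions when lsblk is unavailable:

```shell
# list block devices so the NTFS partition can be identified
if command -v lsblk >/dev/null 2>&1 && lsblk >/dev/null 2>&1; then
    listing=$(lsblk -o NAME,FSTYPE,SIZE,MOUNTPOINT)
else
    listing=$(cat /proc/partitions)
fi
printf '%s\n' "$listing"
```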

Fig.3  NTFSFIX utility run  on Linux without errors.

Verify that the partition is accessible - as I have done below.

Fig.4  Partition is now fully accessible.


Issue solved, hopefully :)



Tuesday, 18 February 2014

[SOLVED] How to inter-connect Oracle Virtualbox Virtual Machine Guest Computers

Many times there is the need to create a set of virtual servers with the goal of interconnecting each guest VM with its peer VMs on the same VirtualBox instance.

Background: I was in the process of installing three virtual servers on an Ubuntu Linux system, viz. a Hyperion Oracle Database (OLTP) server, an Essbase OLAP server and an Oracle Enterprise Performance Management (EPM) server as middleware.
There was the technical requirement to interconnect these Windows Server OS guests in an enterprise environment using the "all-powerful" VirtualBox virtualisation tool.

Fig.1 Virtual Guests Servers on VirtualBox instance needing inter-connectivity.


Solution: 
1. Configure the network adapter of each guest VM to be attached to an "Internal Network" in the network section of "Adapter 1".

Fig.2 Network adapter settings


2. Set up VirtualBox's DHCP server and add an IP range for the use of the guest virtual machines. This configuration simulates a virtual router environment.

This is done by running the command below, which pegs the lowest assignable IP at 10.13.13.101 and the highest at 10.13.13.254.
It also assigns the virtual "DHCP server" itself the IP 10.13.13.100, as scripted below for your use.



VBoxManage dhcpserver add --netname intnet \
--ip 10.13.13.100 --netmask 255.255.255.0 \
--lowerip 10.13.13.101 --upperip 10.13.13.254 --enable

Fig.3 Command run to set up the DHCP server and its accompanying settings



3. Turn the firewall of the guest VM operating systems off entirely.



Fig.4 Firewall settings



4. Switch on the VM guests.
    i) Verify the IP assigned to each guest VM, and
    ii) try connecting to each VM from the others,
    for example: have guest 1 ping guest 2 and vice versa.
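Once each guest reports its address (e.g. via ipconfig on the Windows guests), a quick sanity check is that both IPs fall inside the DHCP server's 10.13.13.0/24 network - a trivial shell sketch of that comparison (the helper function is mine, for illustration only):

```shell
# true when two dotted-quad IPs share the same /24 prefix
same_subnet_24() {
    [ "${1%.*}" = "${2%.*}" ]   # strip the last octet and compare the rest
}
same_subnet_24 10.13.13.101 10.13.13.102 && echo "same /24 - pings should work"
```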



Fig.5    A cross ping between guest Windows OS 1 (10.13.13.101) and 2 (10.13.13.102) succeeds - eureka!


Monday, 6 January 2014

Java Object-Relational Mapping: Using JBoss' Hibernate SchemaExport tool to generate DDL from Entity Classes and Hibernate Mapping Files

Happy New Year, World :)

1. Introduction
I will take you through how to use Java classes (powered by JBoss' Hibernate) to build a database on the fly. I will present a subsequent post on how to use NetBeans to auto-generate entity and mapping files from a data source.

There is every reason not to build your database manually:
    ·         many times you have little information about your eventual deployment environment.
    ·         truly service-oriented applications are database agnostic ("database unaware").


Note: Linux is the development environment in which I will execute the command-line scripts.

I will use the lightweight Apache Derby/JavaDB database for this tutorial. The database starts empty, awaiting schema creation via the Hibernate tools.

These solutions are fundamentally neither database- nor OS-dependent - they should work across all environments.

Fig. 1. The empty application database.


2. Process:
To create a database straight from code using Hibernate ORM (simply put, an implementation of JPA), one needs three inputs, i.e.
         ·  the Hibernate configuration file, i.e. hibernate.cfg.xml
         ·  the Hibernate mapping files, e.g. Tag.hbm.xml
         ·  Java entity classes adorned with persistence annotations, e.g. @Column
       See details below.

      Details.
2.1. Hibernate configuration file: This contains the database connection information and other Hibernate-specific parameters, as sampled below:
       

    Fig. 2. Content of Hibernate Configuration File.



      As you can observe, it also lists the various HBM files (with their full package location stated) that we need as input. Also remember that these two properties must be defined to have, respectively, a persistence session and schema manipulation enabled: hibernate.current_session_context_class and hibernate.hbm2ddl.auto.
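      Since the screenshot does not reproduce here, an illustrative hibernate.cfg.xml is sketched below. The Derby connection values and the mapping list are assumptions patterned on the project layout used later in this post, not the author's exact file:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE hibernate-configuration PUBLIC
    "-//Hibernate/Hibernate Configuration DTD 3.0//EN"
    "http://hibernate.org/dtd/hibernate-configuration-3.0.dtd">
<hibernate-configuration>
  <session-factory>
    <!-- connection details: illustrative Derby/JavaDB values -->
    <property name="hibernate.connection.driver_class">org.apache.derby.jdbc.ClientDriver</property>
    <property name="hibernate.connection.url">jdbc:derby://localhost:1527/blogdb</property>
    <property name="hibernate.dialect">org.hibernate.dialect.DerbyDialect</property>
    <!-- the two properties called out above -->
    <property name="hibernate.current_session_context_class">thread</property>
    <property name="hibernate.hbm2ddl.auto">create</property>
    <!-- the HBM mapping files, with full package location -->
    <mapping resource="org/softlogic/blog/model/Posting.hbm.xml"/>
    <mapping resource="org/softlogic/blog/model/Poster.hbm.xml"/>
    <mapping resource="org/softlogic/blog/model/SysUser.hbm.xml"/>
    <mapping resource="org/softlogic/blog/model/Tag.hbm.xml"/>
  </session-factory>
</hibernate-configuration>
```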

   2.2. The Hibernate mapping (HBM) files: These are needed to model the four database tables and are listed below: Posting.hbm.xml, Poster.hbm.xml, SysUser.hbm.xml and Tag.hbm.xml.

      Note: These were created manually; kindly see my related blog post on how to auto-create HBM files from NetBeans.


      Take a look at the content of the sampled Tag.hbm.xml file. This content describes the signature of the database table to be created (complemented by the Tag entity class in the next section).

Fig. 3. Content of a Hibernate mapping file modeled to mimic a Tag relational entity.
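      The mapping screenshot is not reproduced here; a minimal Tag.hbm.xml might look like the sketch below - the property and column names (id, name) are assumptions for illustration only:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE hibernate-mapping PUBLIC
    "-//Hibernate/Hibernate Mapping DTD 3.0//EN"
    "http://hibernate.org/dtd/hibernate-mapping-3.0.dtd">
<hibernate-mapping package="org.softlogic.blog.model">
  <class name="Tag" table="TAG">
    <!-- surrogate key; "native" lets the database pick an identity strategy -->
    <id name="id" type="long" column="ID">
      <generator class="native"/>
    </id>
    <!-- illustrative attribute of a tag -->
    <property name="name" type="string" column="NAME" not-null="true"/>
  </class>
</hibernate-mapping>
```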

      Note: see other blog posts on how to auto-create the mapping file, this time using the NetBeans Hibernate wizard.


    2.3 Java entity classes: A sampled entity, Tag.java, is scripted below.
Fig. 4. Content of a Hibernate Entity modeled to mimic the desired Tag Table.
 
     Now that we are done with the setup of the needed input files, we move on to the execution phase.

                 
    3. Execution:
    With the various inputs ready, we will use Hibernate's HBM2DDL tool to create the database, using one of the options below to run SchemaExport:
1. invoking SchemaExport from a "Java main class";
2. running it from the command line;
3. running SchemaExport from Ant build scripts.


     3.1. Invoking HBM2DDL SchemaExport from a Java main class

      Create a class, e.g. SchemaCreator.java, to serve as the point of invocation of the schema creation or SQL DDL generation. This class makes use of the persistence session created by the HibernateUtil Java class. HibernateUtil is a specialised class, used by the Hibernate ORM framework, which can hold configuration beyond the settings in hibernate.cfg.xml, from which the framework creates a persistence session. The hibernate.cfg.xml file is located at the root of your Java package structure, as shot below.

    Fig. 5. Location of the hibernate.cfg.xml file at the root of the package structure.

   The HibernateUtil class is detailed below.

     Fig. 6. Content of a HibernateUtil Java file.

      The SessionFactory creation from a hibernate.cfg.xml file is performed by the statement below, as shown in the screenshot above.

      SessionFactory  sessionFactory = new Configuration().configure().buildSessionFactory();

      Note: To have the session created from a hibernate.cfg.xml located elsewhere, use the code below.
     sessionFactory = new Configuration().configure("<CUSTOM_DIRECTORY>/hibernate.cfg.xml").buildSessionFactory();

     You can also add extra configuration parameters by using the instance methods addResource(~)  and addProperties(~) on the Configuration class.

Fig. 7. Instance methods of the Configuration class.
    
        3.1.1 Invoking the schema creation from the SchemaCreator main class
     Fig. 8. Content of the SchemaCreator invocation class

    Note: The above class principally uses two static objects - a SessionFactory and a Session.
    sessionFactory = HibernateUtil.getSessionFactory(); // fetches session details from the configuration in the HibernateUtil class
    session = sessionFactory.getCurrentSession(); // loads the application session from Hibernate, to be used for searches, updates and saves of entity objects in an application.
  
     Proceed to compile the above class and execute it to get the output below.
Fig. 9. Output of execution of SchemaCreator

    Perform a refresh of the database schema to see the new tables created in your database (remember, the connection parameters and more were stated in the hibernate.cfg.xml file).
Fig. 10. Database loaded with tables modeled from mapping files and entities



3.2. Running SchemaExport from the command line
The syntax is stated below.

3.2.1 This is a Java class invocation from the command line:
      java -cp 'hibernate_classpath' org.hibernate.tool.hbm2ddl.SchemaExport options my_hbm_mapping_files

     hibernate_classpath = the runtime resources needed by Hibernate, i.e. dependent classes and jar files
     my_hbm_mapping_files = the location of all the *.hbm.xml files
     options = the various command-line parameters of the SchemaExport tool, i.e.

      parameter                            purpose
      --quiet                              do not output the script to stdout
      --drop                               only drop the tables
      --create                             only create the tables
      --text                               do not export to the database
      --output=db_schema.sql               output the DDL script to a file
      --config=hibernate.cfg.xml           read Hibernate configuration from an XML file
      --properties=hibernate.properties    read database properties from a file (alternative to --config)
      --format                             format the generated SQL nicely in the script
      --delimiter=;                        set an end-of-line delimiter for the script


      3.2.2 Execution:
     java -cp "$HBN_HOME/*" org.hibernate.tool.hbm2ddl.SchemaExport \
       --config=$PROJECT_HOME/src/java/hibernate.cfg.xml \
       --create $PROJECT_HOME/src/java/org/softlogic/blog/model/*.hbm.xml
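For completeness, $HBN_HOME and $PROJECT_HOME above are ordinary environment variables; a sketch of how they might be set (both paths are placeholders for your own layout, not the author's actual directories):

```shell
# illustrative locations only - point these at your real directories
HBN_HOME=/opt/hibernate/lib                   # folder holding the Hibernate jars
PROJECT_HOME=$HOME/NetBeansProjects/blog      # root of the NetBeans project
export HBN_HOME PROJECT_HOME
echo "will load jars from: $HBN_HOME/*"
```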
Fig.11. Command line execution of the Hibernate tool’s SchemaExport as seen on Linux.
  

     3.3. Running SchemaExport from Apache Ant build scripts
     The Ant script, a powerful Java build-administration tool, is essentially an XML file that describes the activities to be carried out on a Java project at build and deployment time. It is also sometimes used to set up the various dependencies a project has before its "first run".

     In our case, on first run we want our database created.

     Just embed the code below, which you can customise, in the build.xml file (or the NetBeans build-impl.xml, whose contents are loaded into build.xml at compile time). The build script is located at the root of your project folder, as captured below.
Fig.12. compilation build file as seen in the project folder

     Modify the project build script by embedding another build target node for schemaexport, as shown below:
<target name="schemaexport">
         <taskdef name="schemaexport" classname="org.hibernate.tool.hbm2ddl.SchemaExportTask"
         classpathref="class.path"/>
   
       <schemaexport config="hibernate.cfg.xml"
        quiet="no"
        text="no"
        create="yes"
        delimiter=";">
       <fileset dir="src">
            <include name="**/*.hbm.xml"/>
       </fileset>
    </schemaexport>
</target>

Fig.13. Config snippet (content of the build.xml file) from the Ant build script

Perform a compilation of the project to have the schema export performed. Refresh the database to confirm the new table additions.


4.0 Legend
    JPA = Java Persistence API
    ORM = Object-Relational Mapping
    DDL = Data Definition Language


  6.0 Don't bother yourself working hard in life; work smart :)




Thursday, 31 October 2013

JSF Primefaces Fileupload runtime dependency error

There are times when you create a PrimeFaces Java project in NetBeans and upon first build are greeted with an error whose stack trace indicates a missing runtime dependency. The error is, as snapped below, related to the PrimeFaces FileUpload widget component not having access to a required class on the classpath of your project.
The error is summarised as: Cause: Class 'org.primefaces.component.fileupload.FileUploadRenderer' is missing a runtime dependency.




Solution:
The PrimeFaces widget was designed by Prime Technology to depend on the Apache Commons FileUpload classes, which can be obtained from the Apache Commons download page as an archive.



Extract and drop the commons-fileupload-1.3.jar file into your project's library folder or on your project classpath, and recompile the project.
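If your project is Maven-based rather than jar-dropping, the same library can be declared as a dependency - a sketch using the commons-fileupload 1.3 coordinates (note that commons-fileupload itself pulls in commons-io as a transitive dependency):

```xml
<!-- pom.xml fragment: Apache Commons FileUpload for the PrimeFaces widget -->
<dependency>
    <groupId>commons-fileupload</groupId>
    <artifactId>commons-fileupload</artifactId>
    <version>1.3</version>
</dependency>
```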

In the NetBeans IDE, just add the jar file to your project's libraries as shown below.




Now rebuild your project and EUREKA.



Friday, 2 August 2013

Movement of an SQL Server Temp Database



INTRODUCTION
The tempdb, being a system database, is used by SQL Server to store internal objects such as the intermediate results of a query. The data pages of tempdb are moved to and from disk as it is accessed by SQL Server, hence it should be placed on a drive that yields good I/O speed.
As a system database, any activity carried out on it must follow a Microsoft-recommended technical line of action - detailed below.

Reason: Usually, the biggest trigger for moving tempdb is limited disk space, which in the short term can be addressed by restarting the database instance to reclaim temp space.

The tempdb grows with verbose result sets queried from the database. It also grows when sorting is carried out on a user database and when there are open transactions. Running the health-check operation DBCC CHECKDB can also balloon tempdb when it runs for too long.

PREREQUISITES
1. The new location should be accessible, i.e. writable.
2. The Windows/SQL Server account should have the privilege to update database file attributes.
3. The database should be running in full, good health.

PROCESSES
1. Determine the logical file names (data and log files) of the tempdb database and their current location on disk. Find below the SQL script and the logical names of the data and log files.


SELECT name, physical_name AS Current_Location
FROM sys.master_files
WHERE database_id = DB_ID(N'tempdb');
GO

2. Modify the location of each file (data and log files) by using the ALTER DATABASE command against the master database.

USE master;
GO

ALTER DATABASE tempdb
MODIFY FILE (NAME = tempdev, FILENAME = 'E:\users\annang\db_bag\tempdb.mdf');
GO
ALTER DATABASE tempdb
MODIFY FILE (NAME = templog, FILENAME = 'E:\users\annang\db_bag\templog.ldf');
GO

3. Now, perform the activities below:
    a. Move the physical data and log files to the new location.
    b. Stop and start the SQL Server instance.
    c. Verify the availability of SQL Server.