Connectivity! - 07-Oct-2018 06:33 

Fritz Box

It is required to configure static port forwarding rules in your router to enable communication between the internet and your container. In my setup I have mapped the ports 38080, 31521 and 35500 to slightly different external ports, let's say x8080, x1521 and x5500. These ports can be reached via the DynDNS address of my NAS.
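The chain of mappings then looks like this (a sketch; the x-ports stand for whatever external ports you pick on the router, and the NAS-side ports are the ones used later in this walkthrough):

```
internet (DynDNS)    router (Fritz!Box)    NAS (Docker host)    container
    x1521   --forward-->   31521   --docker -p-->   1521   (listener)
    x5500   --forward-->   35500   --docker -p-->   5500   (EM Express)
    x8080   --forward-->   38080   --docker -p-->   8080
```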

TNS_ADMIN

The lazy developer utilizes the tnsnames.ora file.

Environment variable

If not already available, create an environment variable named TNS_ADMIN pointing at the directory containing your tnsnames.ora file.
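For example (the paths are hypothetical placeholders, not from this setup):

```shell
# Linux/macOS: set TNS_ADMIN for the current session
# (add the line to your shell profile to make it permanent)
export TNS_ADMIN=/opt/oracle/network/admin

# Windows (PowerShell/cmd): persist it for the current user
#   setx TNS_ADMIN "C:\oracle\network\admin"

# Verify
echo "TNS_ADMIN is: $TNS_ADMIN"
```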

tnsnames.ora

Add the following entries to your existing tnsnames.ora file, or create a new one in the preferred location.

ORCLCDB =
 (DESCRIPTION =
   (ADDRESS_LIST =
     (ADDRESS = (PROTOCOL = TCP)(HOST = xxx.xxxxxxx.xxx)(PORT = X1521))
   )
 (CONNECT_DATA =
   (SERVICE_NAME = ORCLCDB)
 )
)

ORCLPDB1 =
 (DESCRIPTION =
   (ADDRESS_LIST =
     (ADDRESS = (PROTOCOL = TCP)(HOST = xxx.xxxxxxx.xxx)(PORT = X1521))
   )
 (CONNECT_DATA =
   (SERVICE_NAME = ORCLPDB1)
 )
)

Please adapt the host and port to your DynDNS address and external port.

SQL Developer

The current SQL Developer version at the time of writing is 18.2.
With a tnsnames.ora in place it is easy to establish connections to the container and pluggable databases.


Connecting to the SYS schema as SYSDBA allows me to perform DBA-related tasks on the database(s).

Oracle Enterprise Manager

I can open Enterprise Manager at the following URL:

https://:5500/em

I want to connect to the container database ORCLCDB as SYSDBA first. Please note that the container name to be used has to be CDB$ROOT. While trying to log in to the CDB you will most likely face this dialog:


Finally I came across an explanation of the issue and, even better, the solution. We simply have to enable the related global port by executing the command:

exec dbms_xdb_config.setglobalportenabled(TRUE);

as user SYS as SYSDBA on the database ORCLCDB. I had to reload Firefox to be able to connect to the EM.
The global port has to be enabled separately for every pluggable database that needs to be accessed with the EM. In the case of ORCLPDB1, this means running the command as user SYS as SYSDBA on the database ORCLPDB1. For pluggable databases the container name in the EM is equal to the database name. Finally you will see this wonderful dashboard.
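Putting it together, the session for the pluggable database looks like this (a sketch using the ORCLPDB1 alias from the tnsnames.ora above):

```sql
-- Connect to the pluggable database as SYS (ORCLPDB1 alias from tnsnames.ora)
conn sys@ORCLPDB1 as sysdba

-- Enable the EM Express global port for this PDB
exec dbms_xdb_config.setglobalportenabled(TRUE);
```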

sqlplus

Finally I would like to connect to my database(s) using sqlplus. I downloaded the Oracle Instant Client 18.2 from OTN. You need to fetch both the Basic and the SQL*Plus packages. I have unpacked both packages and added the location to my %PATH%.
I start sqlplus from Windows PowerShell with the command

sqlplus /nolog

and log in to the pluggable database with the command

conn sys@orclpdb1 as sysdba



Please ignore the warning => SP2-0310: unable to open file "LOGIN.SQL"
DATABASE not so READY TO USE - 07-Oct-2018 05:00 

Lesson learned

In fact it turned out that the database was not READY TO USE, even though the log stated it was. The listener simply refused the connection. So I waited until today and finally I was able to connect to the db. No further action was required. Simply relax and wait.
DATABASE IS READY TO USE! - 06-Oct-2018 13:51 

Finally

After 2.5 hours the log finally stated:

#########################
DATABASE IS READY TO USE!
######################### 

According to the log, a container database ORCLCDB and the first pluggable database ORCLPDB1
have been created.

As creating an Oracle RDBMS container takes some time, it might be useful to export the container.



And to be on the safe side I prefer to export the container and its content to my Diskstation.


In addition I create a Hyper Backup of my oradata files to have a backup of my database.
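On the command line, the export could look like this (a hypothetical sketch; the container and image names are the ones from this walkthrough, the backup path is my own choice):

```shell
# Export the container's filesystem to a tar archive on the Diskstation
docker export -o /volume4/docker/backup/oracle-18c-database.tar oracle-18c-database

# Alternatively, save the built image itself so it can be re-imported later
docker save -o /volume4/docker/backup/oracle-database-18.3.0-ee.tar oracle/database:18.3.0-ee
```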

Container preparation

I know that it is possible to use the command line, but in this case I prefer the wizard to create and launch the container. To start, I go to the images section, highlight the oracle/database:18.3.0-ee image and press the launch button.

General Settings

Always choose a meaningful container name. I will use oracle-18c-database, as I will not have multiple databases in place at the same point in time. Then I click on the advanced settings button.


I will not limit the resources of my Synology unless it is absolutely required. It is possible to run the container with the default settings, but I would like to make some adjustments.

Advanced settings

Not very advanced, but I would like to enable the auto-restart.


Volume

Mounting volumes enables us to use directories located outside the container from inside the container. This makes it easier to add additional functionality to the container. I mount two volumes.
Both source directories are located in my docker home.


The directory oradata will contain the database files, and the directory apex contains the latest Oracle APEX version 18.2 that will be installed in a later step.

Port settings

I like static port mappings, so here we go. Using static port mappings ensures that all my port forwardings are still valid after a restart of the container.


Pressing apply returns us to the General Settings screen. Everything is in place so we can hit the Next button.

Summary

The summary page highlights all details of the future container. The flag to run the container immediately after the wizard is finished is set already. I hit apply as everything looks fine.

Starting the container will take some time, because at this point in time the databases will be created and you will not see much progress. However you can switch to the containers page to monitor the progress.

Container

This is the place to see all available containers and their status (high level).
To get more details I highlight the container and hit the Details button.

Overview

Did I mention that creating a database takes a hell of a lot of time? If you start to get worried, you can check two tabs of the container details page.

Log


The log tab will show you the actual log of your container and therefore the progress of the installation. However, sometimes it takes a while until the progress is shown in the log (NO PANIC!). It also shows you the initial password set for your DBA schemas. The log will remain for at least some days. In fact, I have not seen any log entries disappear.

Process


The process tab shows the actual processes running in your container. So everything is fine as long as the figures are changing here, even if there is no progress in the log (NO PANIC!).

Terminal

Okay, there is a terminal tab, but I don't use it. You can give it a try, and you will find out why I prefer to use PuTTY instead.

The run command

In case you still want to use the command line, or you are not running this on a Synology:

docker run --name oracle-18c-database \
  -v /volume4/docker/apex:/opt/oracle/apex:rw \
  -v /volume4/docker/oradata:/opt/oracle/oradata:rw \
  -p 31521:1521/tcp -p 35500:5500/tcp -p 38080:8080/tcp \
  oracle/database:18.3.0-ee
SQL Connection via an Encrypted SSH Tunnel - 04-Oct-2018 07:50 - Insum Solutions


Background

I recently did multiple upgrades on my Mac: the O/S to High Sierra, SQL Developer to 18.1, and then the JDK to bring it up to a level that SQL Dev requires. Naturally, I updated to the latest JDK, which is higher than SQL Dev 18.1 supports, but hey, why not?

Any time you change more than one thing, troubleshooting becomes complicated. This was no exception. The challenge started when I realized that by “upgrading” SQL Developer on my Mac, I actually overwrote the previous install, thereby deleting all of my connections and preferences. Sadly, this isn’t the first time I’ve done this. I don’t do a SQL Dev upgrade that often, so I forgot that it would overwrite everything. Sigh.

SSH Tunnel Connections

After increasing the SQL Dev font size (my eyes aren’t what they used to be) I endeavored to recreate my connections. All went well until I tried to set up the connections that require an SSH tunnel. This is a great feature. It allows you to encrypt SQL*Net (or any other kind of traffic) between your laptop and database (or other) server. I’ll spare you most of the trials I attempted working through the different upgrades and provide the spoiler: it was operator error. I didn’t set up my SSH tunnel in SQL Developer correctly.

I worked out how to do it years ago (using SSH without the assistance of SQL Developer). Worked it out again 18 months ago using SQL Developer. And worked it out again yesterday. Each time I went through the same issues and errors. Like I said, when you don’t do something frequently, you forget the details. This time, though, I’m blogging about it, including the error message, so that next year when I Google the error message, I’ll find my own blog post. It’s embarrassing how often that happens. Alas, it’s not just my eyes that are getting old.

Two Flavors

Let’s say you have a database and you want to encrypt the SQL*Net (or jdbc) traffic from the db server to the client (e.g. SQL Developer, SQLcl). This could be for a variety of reasons and there are a variety of solutions. My colleague and overall Oracle and Linux guru, Rich Soule, is about to publish a white paper with all the details of the full setup. I’ll link to that whitepaper when it becomes available. In the meantime, I’m going to present the “SSH Tunnel” solution, in two flavors:

1. Use SSH command line to establish the SSH Tunnel
2. Use SQL Developer to establish the SSH Tunnel

I’m also going to include the error message you may receive if you do it incorrectly.

In SQL Developer you may see the following:

An error was encountered performing the requested operation:
IO Error: Connection reset by peer, connect lapse 73 ms., Authentication lapse 0 ms.
Vendor code 17002

In SQLcl, you may see this:

Status : Failure -Test failed: IO Error: Connection reset by peer, connect lapse 3 ms., Authentication lapse 0 ms.

In both cases, you have probably fallen victim to thinking you have followed the SQL Developer help or the many blog posts about how to do this. Unfortunately, most of those, including the help, look at a simple case with only one or two machines involved. In my case, I always have at least 3 and sometimes 4 or 5 machines involved.

Machines involved

The 4 most likely machines involved are

1. Laptop with SQLDev or SQLcl (myLaptop or “localhost”), potentially outside the firewall.
2. Firewall (insum.ca) which is port forwarding (potentially from some random port, e.g. 44549 *) to an SSH server
3. The SSH server (myssh.insum.ca)
4. The db server (db.insum.ca)

You might have an additional firewall between 3 & 4. Aside from more things to configure and troubleshoot, though, it doesn’t change the steps required for this blog post.

Throughout this blog post, I will use the machine names above in the examples.

* If you have your firewall forward the standard SSH port (22) you will get a LOT of random hacking attempts. If you assign some non-standard port, those attacks will be greatly reduced. In most cases, the attacker will only have a chance of finding your random port by doing a port scan on your firewall and your firewall will shut it down before it makes it to the port you chose.

1 & 4 are required.
If there is a firewall, 3 & 4 are not visible to the outside world.
If any components are collapsed (or don’t exist), you just use the appropriate machine name. So, if you are using an SSH account on your database server (which isn’t that great of an idea, unless it’s all that you have) then you’ll just configure the SSH things on your equivalent of db.insum.ca instead of on myssh.insum.ca.

Configuring the Firewall

If you’re going to do this, you should do it right. That means configuring the firewall correctly, locking down the SSH server so that only the appropriate user or users have access, using key files, etc. That said, you’ll likely have a Linux admin and firewall admin doing at least some of the work. We’ll cover the Linux steps in another post. Let’s assume that you have been given the following:

1. The firewall server name (or the SSH server name if there is not a firewall involved): insum.ca
2. The port on insum.ca that is accepting SSH traffic: 44549
3. The db server name. This is the name that the SSH server can use to see the db: db.insum.ca. Note that your laptop doesn't need to be able to resolve this host.
4. The db service name: dev01.insum.ca
5. The db listener port: 1521
6. An OS user on the SSH server: sshuser
7. Either a key file for the OS user or the password for the OS user**. Key file name: my_key_file, Password for sshuser: myPassword

Throughout this blog post, I will use the values above in the examples.

** Of course, I recommend a passphrase protected key file for this. Describing how to create the key file, set up a no-login SSH account, configure the firewall, etc. is covered in Rich’s white paper. As all of this opens a significant hole in your firewall, I recommend reading that white paper and working with your firewall and Linux admins to do it correctly.

In this blog post, I’m not covering the server-side setups, but I will show how to set up SQL Dev (and SSH command line) to connect with either a key file or password.

Creating an SSH Tunnel Command Line

Creating an SSH tunnel simply establishes a secure channel between two machines and says where to route the traffic passing through the tunnel. In practice, this is accomplished by defining a port on the local machine to accept traffic and routing it via SSH to an SSH server on another machine. The SSH server then routes the traffic to a machine and port accessible to the SSH server. In our scenario, we also have a firewall in the mix. The network traffic will ultimately look like this:

SQLcl connects to localHost on port 55444 > myLaptop forwards the traffic to the firewall (insum.ca) on port 44549 > the firewall forwards the traffic without inspection to the SSH server (myssh.insum.ca) on port 44549 > myssh.insum.ca forwards the traffic to db.insum.ca on port 1521

The firewall is pre-configured to port forward, so as a user, you don’t need to know the name of myssh.insum.ca. As mentioned above, though, you do need an ssh user on myssh.insum.ca. You can use the following command line to establish this tunnel:

ssh -L 55444:db.insum.ca:1521 -f -C -q -N -i my_key_file sshuser@insum.ca -p 44549

The breakdown

-L   localPort:destination_machine:destination_port
        localPort               This can be any available port on localhost (your laptop). 
                                I simply chose 55444 and checked to see it was available.
        destination_machine     The database server
        destination_port        The database listener port

-f -C -q -N  combine to allow you to do a nologin connection, return to the command line without having to ctl-c,
             and to continue running after closing the command window. In order to stop your SSH tunnel,
             you will need to find the process and kill it using
             ps -ef | grep ssh
             Locate the process id and
             kill -9 [pid]

-i   key file name (if you are using a key file)

-p   the port your ssh server is using (or that your firewall is port forwarding)
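Before starting the tunnel, you can verify that your chosen localPort is actually free. A small sketch, using 55444 from the example above and lsof (present on most Macs and Linux boxes):

```shell
# lsof exits non-zero when no process is listening on the port,
# so the else branch means the port is still free to use.
if lsof -iTCP:55444 -sTCP:LISTEN >/dev/null 2>&1; then
    port_free=no
else
    port_free=yes
fi
echo "port 55444 free: $port_free"
```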

If you don’t plan to use a key file, you can skip the -i portion:

ssh -L 55444:db.insum.ca:1521 -f -C -q -N sshuser@insum.ca -p 44549

You will be prompted for either the key file pass phrase or the SSH user’s password.

Voilà, Encryption

That’s it, now you have an SSH tunnel with encrypted traffic running between myLaptop, port 55444, and myssh.insum.ca and forwarding it unencrypted on port 1521 to db.insum.ca.

Keep in mind that anything can use this tunnel: SQLcl, SQL Developer, or literally any other process on your machine that might be aware of this tunnel.

To run SQLcl through the SSH tunnel, just “pretend” the database is running on localhost port 55444:

./sql anton@localhost:55444/dev01.insum.ca

Of course, once you have this SSH tunnel running, you can do the same thing in SQL Developer. Just create a connection using
host: localhost
port: 55444
service name: dev01.insum.ca

Keep in mind, exiting from SQLcl or SQL Developer does not close the SSH tunnel. To do this, you’ll need to kill the process as described above.

Creating an SSH Tunnel in SQL Developer

If you skipped "Creating an SSH Tunnel Command Line" above, you may wish to read it just to get an understanding of what is happening. SQL Developer does essentially the same thing as the command line above, but it gives you a user interface to put in all the bits of information. The key is to put the info in the right spots. Rich’s white paper has lots of pictures of each of these steps. I’ll capture the minimum here.

1. Run SQL Developer
2. Click View > SSH
3. At the lower left, right click on SSH Hosts, then choose New SSH Host

This will create an SSH Tunnel definition equivalent to the command line in the previous section.

You can then create a database connection with connection type SSH.

When you connect to the database you will be prompted for either the key file pass phrase or the SSH user’s password. This will open the SSH tunnel. Alternatively, you can open the SSH tunnel by right clicking on the local port forward under SSH Hosts.

Once the tunnel is up and running, it is accessible to anything running on your laptop. Hence, you are able to use SQLcl through the SSH tunnel established within SQL Developer, much as you could use SQL Developer through the SSH command line tunnel.

Keep in mind, disconnecting from the database in SQL Developer does not close the SSH tunnel. To do this, you’ll need to right click on the local port forward connection (under SSH Hosts) and disconnect, or exit SQL Developer.

Photo credit: Yiran Ding, Unsplash

The post SQL Connection via an Encrypted SSH Tunnel appeared first on Insum.

APEX Plugins for Beginners - 03-Oct-2018 12:09 - Explorer

One of the great features of APEX is the possibility to import JavaScript plugins and css libraries into the application, just by referencing an external link or uploading a .js file.

There are several JavaScript plugins available online that can be used to easily add new functionality to your application. When you add a JavaScript library to a page or application, you usually follow these steps:

As an example, I will use a simple but very useful JavaScript plugin to create real-time masks (https://github.com/igorescobar/jQuery-Mask-Plugin)

1. Upload the files to the Static Application or Workspace files

2. Load these files on application or page level (I will do it page level)

3. Call the JavaScript function.

Next, I will show how to implement this same functionality as an APEX plugin. The advantage of an APEX plugin is that no JavaScript knowledge is required to use it, and it can be easily exported and imported across different applications.

  1. Go to Shared Components/Plug-ins and click on create, and then create from scratch.
  2. Choose a name and internal name. The internal name is used internally by APEX for identifying the plug-in; it is not displayed.
  3. Choose the item type; this defines where the plugin can be used.

Now, you must define a PL/SQL anonymous block that contains functions for rendering, validation, execution and Ajax callbacks for the plug-in. This code can also be stored in the database.


/* When you create an item plugin, you must create the following interface. */
procedure render_mask_field (
  p_item   in            apex_plugin.t_item,
  p_plugin in            apex_plugin.t_plugin,
  p_param  in            apex_plugin.t_item_render_param,
  p_result in out nocopy apex_plugin.t_item_render_result
)
is
  /* Here we define a plugin attribute used by the developer to set the item mask */
  lv_format_mask varchar2(100) := p_item.attribute_01;

  /* We need to call the following function to allow APEX to map the submitted
     value to the actual page item in session state. This function returns the
     mapping name for your page item. If the HTML input element has multiple
     values, set p_is_multi_value to TRUE. */
  lv_item_name varchar2(1000) := apex_plugin.get_input_name_for_page_item(false);
begin
  /* This outputs the necessary HTML code to render a text field */
  sys.htp.p('<input id="'||p_item.name||'" class="text_field apex-item-text '
    ||p_item.element_css_classes||'" name="'||lv_item_name||'" size="'
    ||p_item.element_width||'" type="text" value="'
    ||sys.htf.escape_sc(p_param.value)||'" placeholder="'||p_item.placeholder||'" />');

  /* Here we call the JavaScript that sets the mask on the item,
     e.g. $('#P1_TOTAL').mask('000.000.000.000.000,00'); */
  apex_javascript.add_onload_code('$("#'||p_item.name||'").mask("'||lv_format_mask||'")');
end render_mask_field;

4. Now we set the name of the render function.

5. Supported for: We must define where the plugin is supported and where it is available in the Builder: for Page Items and/or Interactive Grid Columns. We also define whether the plug-in is displayed in the Builder as a supported component for desktop and/or mobile.

6. Standard Attributes: For Item type plug-ins, identify the attributes that apply to this plug-in:

7. Now we save, so APEX will identify the plugin type and add some plugin options. Next we must create the format mask custom attribute.

8. JavaScript Files: what we do next is basically the same as using JavaScript on page level: upload and load the JavaScript file.

Finally, we can save and use the plugin. Just create a page item and change the item type to the plugin that we just created.

The post APEX Plugins for Beginners appeared first on Explorer | Award Winning UK Oracle Partner.

Connect to your synology using ssh

To build the image we have to connect to the Synology via SSH. Root privileges would help. At the latest at this point you need to enable the SSH service on your Synology.

I use PuTTY to connect to my Synology via SSH. Credentials are simply the user name and password of my admin account. To obtain root privileges I use the command:

sudo -i

and key in the admin's password again. Done.

Build the image

The script buildDockerImage.sh to create a single-instance database image is located in the
following folder (the docker root is on volume 4 on my Synology):

/volume4/docker/docker-images-master/OracleDatabase/SingleInstance/dockerfiles

Moving to this folder makes life a lot easier.



Usage of buildDockerImage.sh is explained in the readme. For now I would like to create an Oracle 18.3 Enterprise Edition image, so here we go:

 ./buildDockerImage.sh -v 18.3.0 -e

TROUBLE: I don't like this kind of message:

Docker version is below the minimum required version 17.09
Please upgrade your Docker installation to proceed.


So upgrading Docker does not seem to be an option. For now I grab an earlier version of the build scripts. Maybe this is not the best possible solution, but it works.

SOME SMALLER OBSTACLES: Ok, very small ones

-ash: ./buildDockerImage.sh: Permission denied

Nothing that couldn't be fixed with a chmod +x buildDockerImage.sh

After some time the build will be ready to use (NOTE: the build process downloads all other required images, so there might be some additional activity in your Docker environment in the meantime).




It takes some more time if additional downloads are required!

I'll try to keep it short...

Before you can start you have to install the Docker package on your Synology (it works on my 918+, but I would strongly recommend additional RAM).

Starting from scratch...

I want to use the newest Oracle 18c database for my development/learning environment, and I couldn't find any non-commercial Docker image. I don't want to use the commercial images, because in my understanding I would have to pay a license fee at some point in time.

So let's create a Docker image from scratch. Well, not 100% from scratch, because there is a Git repo out there to support us. First of all we need to grab the build files (docker-images-master.zip) from the repo (simply press the green button and download the zip). Additionally we need the Oracle installation binaries from OTN. Downloading of course requires a free OTN account.

So let's pick the Linux x86-64 (LINUX.X64_180000_db_home.zip) version.

Preparing the Synology...

Please activate the SSH service on your Synology to enable external access (Control Panel / Terminal & SNMP), if not already done.



To utilize the predefined script buildDockerImage.sh we need a compatible folder structure accessible on the Synology, and we have to store the Oracle installation binary inside the correct sub folder.

Let's keep it easy: I extract the docker-images-master.zip into the docker root (/volume4/docker) on my Synology and copy the file LINUX.X64_180000_db_home.zip into the sub folder

/volume4/docker/docker-images-master/OracleDatabase/SingleInstance/dockerfiles/18.3.0

I use WinSCP as admin to transfer the files from Windows to the Synology. In my case the docker root is located on volume 4.


Acting when the user pastes an image - 02-Oct-2018 20:33 - Trent Schafer
I recently saw a question about how to act when someone pastes an image (from clipboard data). There is a plugin that can handle that - storing the image into a table - but I was interested, from a purely academic perspective, in how this would work without using the plugin.

So, taking a look at MDN, I can see that there is a 'paste' event that you can listen to. See the documentation here: https://developer.mozilla.org/en-US/docs/Web/Events/paste.

Ok, so let's give this a whirl! I'm not going to cover loading this into a table/BLOB column, as it's been covered before. What we'll do is create a region with an image element; this will show our image as we paste. This can easily be extended to load it into a canvas, load it into a table - whatever your heart's content!

Firstly, the region to show what the user pastes:


Next, we need a dynamic action. This will be a custom event ("paste") with the selector being the body.


Our True Action will be "Execute JavaScript Code" with code resembling:

function setImage(evt) {
    var dataUrl = evt.target.result;
    $('#imageMagick').attr('src', dataUrl);
}

var pasteEvt = this.browserEvent;
var pasteItems =
    (pasteEvt.clipboardData || pasteEvt.originalEvent.clipboardData).items;

for (var index in pasteItems) {
    var item = pasteItems[index];
    if (item.kind === "file") {
        var pasteBlob = item.getAsFile();
        if (pasteBlob.type === "image/png") {
            var reader = new FileReader();
            reader.onload = setImage;
            reader.readAsDataURL(pasteBlob);
        }
    }
}


Now, when you paste, the image will get loaded into the region, with the img placeholder we set up. Neat! You may wish to cancel event propagation, but I'll leave that up to you to experiment with depending on your needs.

If you want to try out this specific example, I have an example:  https://apex.oracle.com/pls/apex/f?p=14882:32


Sources used in the making of this example:

- https://developer.mozilla.org/en-US/docs/Web/Events/paste
- https://stackoverflow.com/questions/6333814/how-does-the-paste-image-from-clipboard-functionality-work-in-gmail-and-google-c

Oracle Forms and Reports Modernization – Issues with Forms

This post kicks off a four-part series of blogs dedicated to Oracle Forms and Reports Modernization. If your company still uses Forms applications and is looking to modernize them but has not yet found a realistic and reliable solution, then this blog series will be well worth your time.

Forms and Reports Modernization is doable

Forms modernization doesn’t have to be a huge all-at-once undertaking. There are ways to start off small, with a modest budget and timetable and expand from there. Best of all, there is an Oracle technology available now that can make Forms modernization achievable within a reasonable timetable, no matter how many Forms applications you are running.   

In this series we’ll identify the issues with Forms that are holding many companies back in their modernization plans. We’ll then introduce you to Oracle Application Express, a Rapid Application Development platform. We’ll take a detailed look at how Oracle Forms and Oracle APEX work and explain why APEX’s characteristics make it ideally suited to Oracle Forms and Reports Modernization. Finally, we’ll look at simple and advanced Forms/APEX integrations.

 

Enjoy!  

 

A unique predicament

If your company currently uses Oracle Forms Applications, it is probably facing a unique predicament. On one hand, your Forms apps are likely well-maintained by experienced developers. Forms users are efficient with its features. Data-entry and maintenance is a snap even if the look and feel of its applications are from a bygone era. 

On the other hand, the rest of the company likely uses browser-based applications. Much more intuitive, they provide the flow and features that everyone expects from a browser-based experience. 

Your company may be able to put up with this incompatibility, but for how long? The growing under-the-hood difference between Forms and current development software is leading many companies to look for an alternative. To understand the cost of keeping forms applications as-is in your system, let’s look at some of the challenges it presents to developers. 

Challenges from a Development Standpoint

Forms challenges developers in several ways. When running it within a web browser, developers must strictly control the browser version and parameters. Not an easy task, as browsers are often updated. But even when Forms is updated to its latest compatible version and aligned with a proper browser version, its performance often falls well below an average user’s expectations. Forms predates web-based technology and simply doesn’t have the page-to-page flow and logic of web applications. When it comes to web services or mobile applications, Forms is that much more complicated to develop.

Forms is often paired with its sibling, a reporting tool called Reports. Compared to later Oracle reporting tools and other data analysis products, it too is getting old. In Reports, adding a view on your data basically means building a new report. This lack of dynamic reporting capability is now almost unheard of.

 Client-Server Technology

Why so complex? Oracle Forms was originally based on client-server technology. Although Oracle continues Forms support, its client-server characteristics continue to be an obstacle to efficiency and cost savings. For example, JInitiator is the Java virtual machine that allows Forms client applications to run inside web browsers. Developers must install JInitiator on each client computer. Likewise, each developer needs development software installed directly on their own computer or, alternatively, on a Virtual Machine (VM) somewhere in the cloud, before starting their development work. Then, for deployment, the developer must generate executable files and move them into a centralized production environment. Unlike most software today, it isn’t a matter of developing an application and then making it accessible or downloadable from the web. Developer instances are treated individually during upgrades or patches.

All this represents a big chunk of a developer’s time and increasing costs as the company grows.  And, we are not even talking about the complex software and infrastructure required on the web servers to run Oracle Forms on a company’s environments.

Holding back Oracle Forms and Reports Modernization

These significant technical limitations are part of the reason many companies using Forms applications are not connecting them to the web or to web services or even updating them (Yikes!). Without a doubt, they are missing out on what web technologies have to offer. 

Another important reason why companies hesitate to move away from Forms is the perception that the undertaking would be massive, drawn-out and expensive. As mentioned earlier, some companies have hundreds of Forms, each with their own particularities. To open and rewrite every one of them would surely be too difficult.  

PL/SQL vs Java

This perception may have a lot to do with Oracle’s move from PL/SQL-based programming (which is at the heart of Forms) to Java-based programming. Java object-based programming was brought in during the early 2000s to move Oracle products towards connection to and use of the world wide web. The move was judicious, but Java-based development technology is recognized as complex, particularly for PL/SQL Forms developers. In fact, it is another paradigm altogether.

ADF

Over the years, Oracle has proposed a couple of Java-based replacement solutions for Forms users, including a development platform called ADF.  It didn’t catch on too well with Forms developers. For a more detailed explanation, see our blog called “Simplified, Cost Effective Development In EBS – Think Oracle APEX”  

Designer 2000

But Java wasn’t the only complication in the Forms story. Forms and Reports were also included in Oracle’s “CASE tool” based information system generator called Oracle Designer 2000. Designer is a top to bottom business system generation tool. It includes business process modelling, systems analysis modelling, systems design and application development. Forms development in this system contains automation. The resulting Forms applications have several times more code than a manually generated Forms application. What’s more, changing a Form produced with this system means going back to the very beginning of the modelling process. Not many developers still use it.      

Web-based development eventually superseded systems like this, and Designer 2000 is now de-supported by Oracle. Unfortunately, this leaves companies that developed Forms this way with the additional burden of dealing with all that extra code if they wish to modernize or migrate away from their Forms applications.

 APEX for Oracle Forms and Reports Modernization 

Businesses need to know there’s a solution that addresses all these issues. Oracle Application Express (APEX), a Rapid Application Development platform, is firmly rooted in web technology and yet shares many similarities with Forms, not the least of which is that it is PL/SQL based.  

 Today, thanks to a well-thought-out development plan that includes extensive web-service support and ease of use, as well as continuing (I would even say growing) Oracle support, APEX is now the very best alternative for Oracle Forms and Reports Modernization. 

 In my next instalment, I’ll get into the details about the surprising parallels between Forms and Oracle APEX. 

 

See also

Oracle Forms Conversion to Oracle APEX – Where to Start

The Oracle Forms Modernization Journey

How You Know It’s Time To Modernize Your Oracle Forms & Reports App

Debating Whether to Keep Oracle Forms or to Migrate From It?

 

The post Oracle Forms and Reports Modernization – Forms challenges appeared first on Insum.
