Category: Software

As a sysadmin I am always installing VMs, testing OSes, hardening systems, and so on. One issue I keep running into: after a fresh OS install, I need to bring Perl up to date and install the modules my scripts depend on. I always find myself in CPAN installing modules, watching them fail to make, and after banging my head against a wall for a while, I came across this fix.

$ sudo apt-get install build-essential autoconf automake libtool gdb

This installs all the packages needed to build CPAN modules. Once they are in place, open CPAN and go ahead and install whatever modules you need. This works on Debian-based releases such as Ubuntu; if you are on Red Hat, you can use yum to find and install the equivalent packages so that make succeeds in CPAN. Quick and simple as that.
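For reference, a sketch of both commands side by side; on Red Hat-style systems the "Development Tools" yum group pulls in roughly the same toolchain as the individual Debian packages:

```shell
# Debian/Ubuntu: toolchain for compiling CPAN XS modules
sudo apt-get install build-essential autoconf automake libtool gdb

# Red Hat/CentOS/Fedora equivalent: the "Development Tools" group
# bundles gcc, make, autoconf, automake, libtool, and friends
sudo yum groupinstall "Development Tools"
```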


Anyone who knows me probably knows this is one of my biggest pet peeves. I see it happen every day, but from a large site like LinkedIn.. shame on you. As all of you probably know by now, a list of more than 6.5 million SHA-1 password hashes from LinkedIn was posted to a Russian site, with a request for help cracking them.

Now, for all of you who have worked with protecting sensitive data, I am sure you see the problem with hashing a password with SHA-1 alone, without salting. Once an attacker determines the algorithm used to hash the passwords, the rest is fairly easy. The simpler passwords fall quickly, and when the remaining, harder ones are broadcast to a cracking community, it takes little time for them to be uncovered as well.

Salting a password is the process of mixing random bits into the password before hashing it. This makes an attacker's job much more difficult: instead of hashing each dictionary word once, they would have to try each candidate password combined with every possible salt variation. That multiplies the storage and computing power required very quickly, creating an expensive and difficult task; a hacker would need a large budget, along with tons of storage and compute, to break a properly salted hash. Whereas with a plain SHA-1 hash, an attacker only has to hash each password trial once and compare it to the stolen value. If the value does not match, they simply move on to the next dictionary word, with no salt variations of the same password to worry about.
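To make the difference concrete, here is a minimal Python sketch (the function names are my own, purely illustrative): an unsalted SHA-1 digest is identical for every user who picked the same password, while a salted digest differs even when the passwords match.

```python
import hashlib
import os

def unsalted_sha1(password):
    # Every user with this password gets the same digest,
    # so one precomputed dictionary cracks them all.
    return hashlib.sha1(password.encode()).hexdigest()

def salted_sha1(password, salt=None):
    # A fresh random salt per user forces the attacker to
    # re-hash the whole dictionary for each stolen entry.
    if salt is None:
        salt = os.urandom(16)
    digest = hashlib.sha1(salt + password.encode()).hexdigest()
    return salt.hex(), digest

# Two users pick the same weak password.
print(unsalted_sha1("linkedin1") == unsalted_sha1("linkedin1"))  # identical hashes
salt_a, hash_a = salted_sha1("linkedin1")
salt_b, hash_b = salted_sha1("linkedin1")
print(hash_a == hash_b)  # different hashes, because the salts differ
```

In practice a deliberately slow algorithm such as bcrypt would be preferable to SHA-1 even with a salt, but the sketch shows why salting alone already breaks precomputed dictionaries.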

So as you can see, the time and expense of cracking salted passwords can be astronomical, and even a deterrent for a hacker attempting to grab hold of your data. With all the problems these days with information being leaked or stolen, you would think a site like LinkedIn would have been more mindful of the world we live in and not assumed that the password obscurity they were using was "good enough". If the attackers do in fact have the usernames tied to these passwords, then the truth of it is that most people reuse their usernames and passwords. If a hacker grabs your credentials for one site, they can potentially try that same combination on other sites you visit. Thinking the breach that happened to LinkedIn is confined solely to that site is naive. Don't fool yourselves: change that password everywhere you use it.

My recommendation to you is to vary your passwords. Try not to reuse one password in many places. Come up with a methodology for your passwords so that if you do have to reuse them, you have a system for which passwords you use where. This will make it more difficult for someone to leverage one stolen password to gain access to more of your personal information.

For all of you who enjoy working in Linux, you may have run across a Perl web framework called Mason. Mason is a framework much like Catalyst, but I think it is easier to use and less convoluted. It allows you to create components that get and set the data for a web page. Now, I know Perl has its drawbacks like any other language out there. What I do like about it, though, is the vast library of CPAN modules that will do just about anything you want.

The Mason framework lets you build a more object-oriented site and create handlers that do some of the heavy lifting, leaving Apache to serve the rendered HTML. One module I have used in conjunction with Mason is a module called Prototype, which lets you add Ajax to your web pages for a richer user experience. But I digress.

One thing I wanted to talk about here is how to add the handler to your Apache configuration. This is fairly straightforward, as with any CGI handler. One thing to make sure of is to have a directory (usually called data) where Mason can cache its compiled objects. You also need to assign the component root so that Mason knows where to start looking for components when they are called.

Now, one quirk I found: I usually create a subfolder in the web root for all my components to go in, but when I used this directory as my component root, my autohandler did not run and assist with processing the pages correctly. So what is an autohandler? It is much like Application.cfm or Application.cfc in ColdFusion: it instantiates your application and sets global, session, and other variables. The autohandler lets you set up your database connection object, along with any processing that needs to happen every time a page is called within the application.
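As a sketch, a minimal autohandler placed at the top of the component root might look like the following; where exactly you create and stash your database handle is up to you, this just shows the wrapping shape:

```
%# autohandler -- Mason runs this wrapper before every requested component
<%init>
  # Per-request setup goes here (e.g. create your database
  # connection object and make it available to child components).
</%init>
<html>
<body>
% $m->call_next;   # hand control to the page that was actually requested
</body>
</html>
```

The key call is `$m->call_next`, which passes control down the wrapping chain to the component the browser asked for.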

Since the autohandler needs to run, one thing I have done is set my component root equal to the root of my application. Now, I believe you can add the handler to Apache as the primary CGI handler, much like you would add Perl as the handler for .pl files. However, for the sake of testing and flexibility, I found that just using your web root as the component root works just as easily. I still use a subfolder to contain all my components; I just have to make sure the path I use for a component includes the folder it is in. Not such a big deal to me.

Other than that, all you need is to set your PerlHandler to HTML::Mason::ApacheHandler, and possibly use PerlAddVar to make global variables persistent. Below is a sample setup I have made for you to see a possible scenario of setting up Mason as the Perl web framework.
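Here is a sketch of such an httpd.conf fragment, assuming /var/www/html as the web/component root and /var/www/mason_data as the cache directory (adjust both paths to your own layout):

```apache
# Tell Mason where components live and where to cache compiled objects.
PerlSetVar MasonCompRoot /var/www/html
PerlSetVar MasonDataDir  /var/www/mason_data

# Optional: declare a global so it can be used in any component.
PerlAddVar MasonAllowGlobals $dbh

# Hand .html requests to Mason; other static assets still go through Apache.
<FilesMatch "\.html$">
    SetHandler perl-script
    PerlHandler HTML::Mason::ApacheHandler
</FilesMatch>
```

The Mason* parameters are picked up by HTML::Mason::ApacheHandler at server start, so a graceful restart of Apache is needed after changing them.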

Well, I must say, so far I am impressed with ColdFusion 10. Adobe has done a large overhaul of its base along with adding many new features. One thing that impresses me is that Adobe has replaced the antiquated JRun subsystem that CF used to run on with Tomcat, which is a notably stronger foundation for a Java-based application server. The drawback is that the structure of the new CF is very different from older versions, especially when it comes to multi-instance deployments.

Adobe has tried to keep things as similar as possible, but one thing I noticed is that when installing a multi-instance server, you no longer have that option in the initial install. You install CF 10 first; then, if your license supports multiple instances, you have the option afterwards to install additional instances. Now on to the part I have been most interested in.

One enormous drawback of earlier versions of CF was that when you installed multiple instances for separate applications, you had one JVM config file shared between all instances. What does this mean? It means that any memory adjustments or class file inclusions you made applied to all instances, so they were never truly segmented. I have always believed that multi-instance deployments should be completely autonomous, with nothing in one instance affecting the others; otherwise it defeats the purpose of having separate instances at all. In older versions it was not impossible to have a separate JVM config file per instance, it was just a pain in the rear: you had to remove the service and reinstall it pointed at the new JVM config. CF 10 has embraced this idea and truly segmented the instances from one another. It was confusing to me at first because I was used to all instances sharing the same bin and lib folders in the root. In CF 10, each instance is listed under the CF 10 root as its own folder, with its own JVM config and file structure underneath. This makes each instance totally autonomous. Awesome!
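For illustration, each instance now carries its own jvm.config under its own bin folder, so heap settings like these can differ per instance. The paths and values below are example numbers only, not recommendations:

```
# <instance>/bin/jvm.config -- per-instance JVM settings in CF 10
java.home=C:/ColdFusion10/jre

# Heap and GC flags here now apply only to this one instance.
java.args=-server -Xms256m -Xmx1024m -XX:MaxPermSize=192m
```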

Another advancement is the scheduled task manager. Anyone who knows me knows the old one drove me nuts: previously, when tasks failed, you would never know unless you trapped for the failure inside the task itself, and the logging for these tasks was vague at best. Well, Adobe has completely redone this aspect and given the scheduled task manager a true facelift with some pretty cool features. So I encourage all you CF geeks out there: give the below link a look and leave your comments on what you think of the new and improved CF. Happy coding!

