Adding a Custom Domain from Namecheap to Microsoft Azure

To add a custom domain name to a website hosted on Microsoft Azure (like this website), you must first go into Azure’s SCALE configuration page and make sure the web hosting plan mode is anything above Free mode. I picked Shared mode.

[Screenshot: Web hosting plan mode]

You’ll then see a “Manage Domains” button at the bottom of the page. Click it and you will see “THE IP ADDRESS TO USE WHEN YOU CONFIGURE A RECORDS”. Take note of that IP address.

Next, go into Namecheap. Click on the domain you want to use, then click “All Host Records” on the left side under Host Management.

We need to create one “A (Address)” and three “CNAME (Alias)” entries. Create the entries as shown below:

HOST NAME     IP Address/URL               Record Type
@             <the IP address from Azure>  A
www           <website>                    CNAME
awverify      awverify.<website>           CNAME
awverify.www  awverify.<website>           CNAME

Save the settings and you should be done with Namecheap.

[Screenshot: Namecheap host records]

Lastly, go back into Azure’s “Manage Custom Domains” and type in the domain name you’ve just configured. The input box should show a green checkmark, indicating that Azure was able to verify the DNS records you set up at Namecheap.

[Screenshot: Azure manage custom domains]

Click the checkmark and you can now access your Azure website with your custom domain!

Becoming a Software Engineer


I quit my IT job 2 weeks ago so that I could attend a coding boot camp called Hack Reactor. It is a 12-week course (6 days a week, 11 hours a day) that started today. Based on previous graduating classes, 99% of students found a software engineering job within 3 months, with an average starting salary of $105,000, as of this blog post’s date.

I’m super excited and can’t wait to see how much I will grow and where I will be after I graduate.

Samsung TV Overscan HDMI Fix

I connect my Hackintosh to my Samsung TV to watch movies, but there’s a problem: the TV cuts off part of the edges of the picture.


If I open up the OS X Displays preferences, there’s a checkbox for “Overscan”. Unchecking it shows my entire desktop on the TV with nothing cut off, but then there’s a black border.

[Screenshots: tv2-osx, tv2-osxoverscanoff]
So I had been using it like that for a few weeks, but I finally decided to look for a fix to get rid of the black borders. I searched online and found this forum post: How to Disable overscan on a Samsung “SMART” tv – solved

What the user Sol did was change the “Edit Name” for the HDMI connection to “PC”. So I tried it out.

[Screenshots: tv3-hdmi, tv3-edit, tv3-pc]
Success?? No, not yet. It only made the black border even bigger for me!


I opened up the OS X Displays Preferences again, and turned Overscan back On, and voila! My whole desktop can be seen now, with no black borders on the Samsung TV!


P.S. My Samsung TV is new, and I haven’t removed the plastics and energy guide sticker yet. :)

TMS Cannot Upload New Software (404)

Our Cisco TelePresence endpoints are currently on TC6.0.1, while the latest release is TC7.1.3. So I opened up TMS > Systems > System Upgrade > Software Manager; this is where we upload the latest files to TMS so that the endpoints can download updates easily.

The problem shows up when you click “Upload New Software”, select the TC7.1.3 .pkg file, and hit the Upload button.


Everything looks fine, right until the upload reaches 100%. At that point, it’ll show: “Server Error – 404 – File or directory not found.”


The reason is that TMS only allows files up to ~300MB to be uploaded, and the TC7.1.3 .pkg file is ~320MB… just a little over the upload limit.

The solution: Remote Desktop into the TMS server and edit the Web.config file, located at “C:\Program Files (x86)\TANDBERG\TMS\wwwTMS\”.

Find this line:

<httpRuntime maxRequestLength="307200" executionTimeout="300" requestValidationMode="2.0" />

And replace it with this line:

<httpRuntime maxRequestLength="512000" executionTimeout="300" requestValidationMode="2.0" />

Then find this line:

<requestLimits maxAllowedContentLength="314572800" />

And replace it with this line:

<requestLimits maxAllowedContentLength="512572800" />

This will increase the upload file size limit to ~500MB. Now restart IIS and the new limit will be used.
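As a quick sanity check on the units (maxRequestLength is specified in kilobytes, while maxAllowedContentLength is in bytes), you can work out what the new values come to:

```shell
# maxRequestLength is specified in kilobytes
echo "$((512000 / 1024)) MB"            # 512000 KB = 500 MB
# maxAllowedContentLength is specified in bytes
echo "$((512572800 / 1024 / 1024)) MB"  # just under 500 MB
```

The effective cap is the smaller of the two, which is still plenty for the ~320MB .pkg file.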

Tip: Copy the Web.config file to the Desktop and edit that copy. When finished editing, save it, drag it back into the wwwTMS folder, and click Replace file.

Migrating TMS Agent to TMS Provisioning Extension

Before upgrading Cisco TelePresence Management Suite from v13 to v14, the TMS Agent must first be migrated to the TMS Provisioning Extension (TMSPE). My company’s TMS was on v13.1, and the latest version was v14.3.2. After reading through the upgrade guides, I figured out that my steps had to be done in this order:

  1. Upgrade TMS from 13.1 to 13.2.x.
  2. Install TMSPE 1.0.
  3. Upgrade TMS from 13.2.x to 14.3.2.
  4. Upgrade TMSPE from 1.0 to 1.1.
  5. And finally upgrade TMSPE from 1.1 to 1.2.

This should have been pretty easy and straightforward, but we also wanted to migrate TMS from its physical box to a virtual machine. So I installed a fresh copy of TMS v13.1 on a brand-new Windows Server 2008 R2 VM, did a SQL backup from the physical box, restored it to the SQL server we wanted to use, and then upgraded TMS v13.1 to v13.2.

This is the point where I thought all the data was restored, upgraded, and ready to be migrated. I installed TMSPE and ran the migration tool, but for some reason it said Migration Successful with 0 users migrated.


I was confused, so I did a bit of googling. Thankfully, I found this post written by cfiestas on the Cisco forums: How to transfer OpenDS/Provisioning VCS data to TMS for recovery purposes. It turns out that the TMS Agent data isn’t stored in the SQL database at all; it lives in C:\Program Files\TANDBERG\TMS\Provisioning\OpenDS-2.0\db\userRoot. So I went to that path on the physical TMS box and copied the folder over to the new virtualized TMS.


This time when I reran the TMSPE migration tool, 37 users were migrated successfully! Now I am able to upgrade TMS v13.2 to v14.3.2, and upgrade TMSPE to v1.2.


TMSPE Installation – SQL Authentication Error

Upgrading an older version of Cisco TelePresence Management Suite (v13.1) to a newer version (v13.2+) requires migrating the TMS Agent to the TMS Provisioning Extension.

When running the TMSPE installer (TMSProvisioningExtension.exe), it will ask you for the SQL server that you want to use.


After entering the correct information, you may encounter this error message:

“A network-related or instance-specific error occurred while establishing a connection to SQL Server. The server was not found or was not accessible. Verify that the instance name is correct and that SQL Server is configured to allow remote connections. (provider: SQL Network Interfaces, error: 26 – Error Locating Server/Instance Specified)”


The solution is very simple: the installer needs to know the SQL Server’s port number. All you have to do is change <SQL Server Name> to <SQL Server Name>:<port number>. In my case, the SQL Server was using the default port (1433), so I just had to change “sql-06” to “sql-06:1433”.
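Sketched out in shell terms (the server name comes from this post; 1433 is SQL Server’s default TCP port):

```shell
# Server name from the post; 1433 is SQL Server's default TCP port
SQL_SERVER="sql-06"
SQL_PORT=1433
# The value to enter in the installer's SQL Server field:
echo "${SQL_SERVER}:${SQL_PORT}"   # prints: sql-06:1433
```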


Building a Hackintosh

I have a 13″ MacBook Pro with Retina display that I use as my main computer for everything I do at home, and I thought it’d be good enough for gaming. However, when I started playing Counter-Strike: Global Offensive, there was noticeable lag every now and then, especially in big maps that have a lot of objects to render. I had to lower the game resolution to somewhere around 720p and keep the graphics settings on medium. Even then, I wasn’t happy with the low FPS I was getting.

This made me rethink using a MacBook as my main machine. Since it was on my desk 99% of the time anyway, why not just use a desktop computer? I wanted a more powerful machine that could handle gaming, I still wanted to use Mac OS X, and I wanted it to be cheap. So what better thing to do than build my own Hackintosh!?

The first thing I did was check out this article: Building a CustoMac: Buyer’s Guide May 2014. It provides a list of the easiest and best-supported hardware options for installing Mac OS X.

The Parts

As I mentioned above, I didn’t want to spend a ton of money on expensive parts, but I still wanted something good for gaming. So the parts that I picked out are:

The total came out to $1,259.55… which is just a little above what I paid for my MacBook.

The Installation

For the most part, it was pretty straightforward using this guide: UniBeast: Install OS X Mavericks on Any Supported Intel-based PC

To summarize my installation steps:

  1. Download Mavericks from the App Store on existing MacBook.
  2. Plug in a USB flash drive to make an OS X installer.
    • Make it 1 partition, set the option to Master Boot Record, format it to Mac OS Extended (Journaled).
    • Run Unibeast and let it put the Mavericks installer in the flash drive.
    • Copy MultiBeast into the finished flash drive.
  3. Turn on the newly built computer and change some of the UEFI settings.
    • Load Optimized Defaults.
    • Set X.M.P. Memory Profile to Profile1.
    • Change XHCI mode to Auto instead of Smart Auto.
    • Set EHCI to Enabled.
    • Set XHCI to Enabled.
  4. Boot up from the USB flash drive.
  5. Open Disk Utility and format the SSD.
    • Create 1 partition, Option: GUID Partition, Format: Mac OS Extended (Journaled).
  6. Install OS X Mavericks to the SSD.
  7. With OS X now running, open MultiBeast to install some drivers and tweaks to make the Hackintosh run better.

The End

This entire process took about 3.5 hours. Here’s the rough timeline of my steps:

  • I started building the computer around 5pm and finished building around 6:30pm.
  • The UniBeast part took ~20 minutes. I should’ve had this step running while I was building the computer.
  • Changing the UEFI settings took ~10 minutes. I was getting a kernel panic while booting into the installer, so I had to search online to see which settings worked.
  • Installing OS X took ~20 minutes.
  • MultiBeast took ~10 minutes. I looked online to see which checkboxes other people with a similar build had enabled.
  • I transferred Counter-Strike: Global Offensive files (~8GB) from my MacBook to my Hackintosh, which took ~20 minutes.
  • By 8:30pm, I got on Steam and was able to play Counter-Strike in 1080p on high graphics settings. And. It. Was. Smooth!

Overall, it was a pretty painless process and I am very happy with my new Hackintosh.


P.S. Although the Logitech C920 webcam captures crisp, clear images, it has a TERRIBLE microphone… it makes me sound muffled and far away. I will definitely replace this webcam with one that has a better mic, which is what I really care about since I talk with my friends on Mumble every day.

Cisco VCS Insecure Password in Use

We had a Cisco TelePresence Video Communication Server on version X7.1. The newest version is X8.1.1, so my task was to upgrade it, which went very smoothly! The only warning alarm that popped up was this message: “Insecure password in use – The root user’s password is hashed using MD5, which is not secure enough”


The solution is quite simple: we just have to SSH into the VCS with the root user account and change the password. On Windows you can use PuTTY; since I’m on a Mac, I can open up Terminal and type ssh root@<ip-address>. Type in the current password to log in, then run passwd to create a new password, and you’re done!


The reason is that after version X7.2, Cisco changed the hash method from MD5 to SHA-512. Administrator account passwords are automatically rehashed after the upgrade, but the root account’s password is not.


No more tickets now!

Scheduling PHP Cron Jobs with Parameters

I was making a PHP script for reading a resource account’s calendar from an Exchange server and then saving the data to a MySQL database. This script was going to be a cron job on the web server set to run every 2 minutes, and I was going to have multiple cron jobs for reading different resource account calendars.

Instead of having a separate cron file for each account, I made one cron_rooms.php file that reads the room name from the HTTP $_GET variable. The PHP script worked great through the web browser, with a URL something like this: cron_rooms.php?room=[name]

To schedule the cron jobs, I added them easily through the webmin web control panel in the browser.


To do it through command line in SSH, I’d run this command: crontab -u <username> -e

Then I’d enter the lines below for updating my 5 rooms every 2 minutes.

0,2,4,6,8,10,12,14,16,18,20,22,24,26,28,30,32,34,36,38,40,42,44,46,48,50,52,54,56,58 * * * * /usr/bin/php -q /var/www/cron_rooms.php?room=dsv #Update DSV
0,2,4,6,8,10,12,14,16,18,20,22,24,26,28,30,32,34,36,38,40,42,44,46,48,50,52,54,56,58 * * * * /usr/bin/php -q /var/www/cron_rooms.php?room=9991 #Update 9991
0,2,4,6,8,10,12,14,16,18,20,22,24,26,28,30,32,34,36,38,40,42,44,46,48,50,52,54,56,58 * * * * /usr/bin/php -q /var/www/cron_rooms.php?room=9992 #Update 9992
0,2,4,6,8,10,12,14,16,18,20,22,24,26,28,30,32,34,36,38,40,42,44,46,48,50,52,54,56,58 * * * * /usr/bin/php -q /var/www/cron_rooms.php?room=9993 #Update 9993
0,2,4,6,8,10,12,14,16,18,20,22,24,26,28,30,32,34,36,38,40,42,44,46,48,50,52,54,56,58 * * * * /usr/bin/php -q /var/www/cron_rooms.php?room=9994 #Update 9994
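As an aside, cron supports step syntax: */2 in the minutes field means “every 2 minutes” and is equivalent to that long explicit list. You can generate the list to double-check:

```shell
# */2 in the minutes field is shorthand for this explicit list of minutes
seq 0 2 58 | paste -s -d ',' -
```

which prints exactly the comma-separated minute list used in the crontab lines.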

So anyway, all is good and ready to go, right?… Wrong!

When the cron jobs ran, they gave me these errors:

Could not open input file: /var/www/cron_rooms.php?room=dsv
Could not open input file: /var/www/cron_rooms.php?room=9991
Could not open input file: /var/www/cron_rooms.php?room=9992
Could not open input file: /var/www/cron_rooms.php?room=9993
Could not open input file: /var/www/cron_rooms.php?room=9994

Why did the script work in the browser but not in cron? Simple: my script used HTTP GET parameters, which only work in URLs. When running in command-line mode (which is how cron runs the script), I need to pass the parameters as arguments instead.

Solution: I replaced the $_GET[‘room’] variable in my script with $argv[1], then changed each line in the crontab from this:

/usr/bin/php -q /var/www/cron_rooms.php?room=[name]

to this:

/usr/bin/php -q /var/www/cron_rooms.php [name]

In command line mode, $argv[0] will always contain the path of the script: “/var/www/cron_rooms.php”

And $argv[1] would be the first argument after the path: [name]

If we had more arguments, we can access them with $argv[2], $argv[3], etc.
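The same mechanics can be sketched in shell, using a throwaway demo script (not the actual cron_rooms.php) to show how positional arguments arrive:

```shell
# Create a throwaway script that mimics how cron invokes a CLI script
demo=$(mktemp)
cat > "$demo" <<'EOF'
#!/bin/sh
# "$0" is the script path (PHP's $argv[0]); "$1" is the first argument ($argv[1])
echo "room=$1"
EOF
sh "$demo" dsv   # prints: room=dsv
rm -f "$demo"
```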

Video Conferencing Endpoint Not Saving CDR in TMS

There was one Tandberg VC endpoint on our network, which we call SGP, that would not save any call detail records (CDRs) into our TelePresence Management Suite.


This particular endpoint was set up with a public IP address facing the public internet, so we thought maybe some firewall rules were blocking the CDRs being sent to TMS. Before asking the network engineers to look at the firewall and network rules, I tried playing with every single setting on the endpoint that I could think of. Nothing I changed would get the CDRs to save.

One day, while I was looking at the settings for a different device in TMS, I noticed a tab called Connection. In that area, there was a “System Connectivity” dropdown menu with “Reachable on LAN” chosen. Then I thought to myself, what if I change it to “Reachable on Public Internet” for the SGP VC endpoint? Because SGP was facing the public internet after all…


Solution: I went to SGP’s Connection page in TMS, changed the connectivity to “Reachable on Public Internet”, pressed Save, and made a test video call to the endpoint. Lo and behold, the CDR was sent to TMS!


Now we can finally start gathering CDRs for this VC endpoint after years of deployment, and we can give proper usage reports to our managers.