7. November 2014 10:29
I've been using a MacBook Pro at work for some time. It's an 8 GB i7 with a 750 GB hard drive. It's not one of the new Retinas, but it's definitely a nice laptop. I decided to purchase a Mac for my home. I'm not in the mood to spend $2500 for a 15" MacBook Pro. No doubt it's a great machine, but I just want something to dink around with. I ended up deciding on a Mac mini. I purchased a 3 GHz i7 and upgraded the RAM to 16 GB. I've been really pleased with how well this little box works. The software is very well thought out.

I purchased Parallels for OS X and installed Windows 8. It's really nice to be able to work in Windows (specifically Visual Studio 2013) and Xcode at the same time. I didn't expect to load up the Windows VM much, but I'm constantly adding new developer functionality, most recently SQL Server 2014 and IIS in a spiffy new VM running Windows Server 2012 R2. It's very cool, and with 16 GB, I have enough RAM to run both machines simultaneously. It's not like having dedicated hardware, but it is good enough for my needs.
I'll continue to post as I discover new uses for my Mac. I'm already hooked on iPhoto, but I understand it's probably going away. Still, the auto-import and Faces functionality is really nice.
7. November 2014 07:39
I have a small home network that I use for hobbyist programming and R&D on technologies that are new (to me). I lost one of my servers (which also acted as a backup domain controller) a couple of days ago. There was no real content on the server that I needed to save, and fortunately, the disk drive is OK; it's the motherboard that went out.
I'm debating whether to continue maintaining this home network. It's a constant maintenance burden. If the internal DNS server goes down, my wife gets mad. If the print server starts acting up, my wife gets mad at me. And I'm working with really old hardware, which could go out at any moment. You get the picture...
So, I'm debating just shutting down the home network, removing my two workstations from the domain, and going back to being an average hobbyist. I primarily need access to a SQL Server instance and an IIS server, and I have a couple of options. I can use the services on WinHost as my primary environment, and I also have an MSDN account through my company that allows me to set up sophisticated development environments. In other words, I can keep developing while simplifying my life.
I'll think on this a little more and hopefully get some things implemented over the weekend.
5. October 2014 15:24
If you ever get the error "Update-AzureVM : BadRequest: The value for parameter 'SubnetNames' is null or empty.", do not fret. The fix is simple: the VM you are trying to assign a static IP to does not exist in the subnet you are trying to assign it to. I ended up deleting and recreating the server, making sure I used the "Gallery" option rather than the "Quick" creation option, and then ensuring I created the VM in my private network.
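For reference, here's a sketch of how to verify the subnet and then assign the static IP with the classic Azure PowerShell cmdlets; the service, VM, and IP values below are placeholders, so substitute your own:

```powershell
# Placeholder names -- use your own cloud service, VM, and address.
# First, confirm the VM actually sits in the subnet you expect:
Get-AzureVM -ServiceName "MyCloudService" -Name "MyVM" | Get-AzureSubnet

# Then assign the static IP and push the update:
Get-AzureVM -ServiceName "MyCloudService" -Name "MyVM" |
    Set-AzureStaticVNetIP -IPAddress "10.0.1.10" |
    Update-AzureVM
```

If Get-AzureSubnet comes back empty, you've hit the same situation I did: the VM isn't in the virtual network at all, and no amount of Update-AzureVM will fix it without recreating the machine.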
25. December 2013 09:44
I've been involved with Git at work since late May of 2013. Transitioning to Eclipse, Java and Git was a bit intimidating in the beginning. However, after some bumps in the road learning how to use Git, I became a convert to this version control system. It's lightweight, fast and runs well on Windows Server 2012. I'm slowly migrating my projects off of TFS 2010 to a local Git repository. What really sold me on Git is the ease of branching and merging operations. Then I discovered that the add-in component for VS 2013 works as well as or better than Eclipse's Git integration. About the only thing I'm losing when I leave TFS is work item tracking, but that function will be filled by a $10 copy of Jira for my personal use.
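To show why the branching and merging sold me, here's the entire lifecycle of a feature branch from the command line; the repo, branch, and file names are just examples:

```shell
# Create a throwaway repo (names and paths here are examples only)
git init demo && cd demo
git config user.name "Example" && git config user.email "example@example.com"
echo "v1" > notes.txt
git add notes.txt && git commit -m "initial commit"

# Branch, commit, and merge back -- each step is near-instant
git checkout -b feature/cleanup
echo "v2" >> notes.txt
git commit -am "feature work"
git checkout -              # jump back to the previous branch
git merge feature/cleanup   # fast-forward merge
git branch -d feature/cleanup
```

Compare that to branching in TFS 2010, where a branch is a server-side copy of the whole folder tree. In Git, the branch is just a pointer, which is why the whole round trip above takes a second or two.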
I don't own powerful server hardware, so Git is a godsend compared to how TFS 2010 and SharePoint 2010 drag on my outdated servers.
This series of blog posts will cover my experiences installing and configuring GitStack.com's version of Git for Windows:
VS 2013 and Git
- Stuff to download (prerequisites):
- First, check out this post on MSDN to make sure your VS 2013 Git client is operational. I strongly suggest you make sure you can connect VS 2013 to a GitHub or Bitbucket project and clone successfully before you even begin downloading anything else. (I started out using GitHub, but switched to Bitbucket. Bitbucket gives me Jira functionality (issue tracking) and limited code reviews. It's free and allows private projects.)
- I like to install a command-line Git tool. I use Chocolatey to install the Git command-line tools. I've run into some issues getting this to work over SSL, but I've solved all the problems and documented them in this series of articles.
- If you are going to use Git over SSL, you may need OpenSSL for Windows. GitStack needs an RSA key file, and OpenSSL is the only way I know to generate this file and create a certificate request tied to the key.
- You'll need the gitstack.com git installer.
- I generally like to have a copy of Cygwin available. This is completely optional and not really used for this install (unless something goes incredibly wrong).
- There's probably some other stuff I've forgotten. I'll add downloads here as time and memory permits.
- Configure ports
- Firewall Changes
- Users and Groups
- LDAP/Active Directory
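Before I fill in the sections above, here's roughly what the OpenSSL step from the prerequisites looks like; the filenames and the subject name are just examples, so adjust them for your own server:

```shell
# Generate a 2048-bit RSA private key (filename is an example)
openssl genrsa -out gitstack.key 2048

# Create a certificate signing request tied to that key; the
# subject here is a placeholder -- use your server's real name
openssl req -new -key gitstack.key -out gitstack.csr \
    -subj "/CN=git.example.com"
```

You hand the .csr to your certificate authority (or sign it yourself for a home lab) and point GitStack at the resulting certificate plus the .key file.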
Since it's Christmas morning and the family is waking up, I'll fill in the details later today and tomorrow. I need to go prepare Christmas breakfast for the World of WarCrack crowd.
16. March 2013 07:48
I was on call last week during some incredibly hectic network switchovers and server patching. I learned a lot about our infrastructure and, unfortunately, found out about several Windows services that only one or two people knew about (and only one of the knowledgeable people was reachable). This little incident caused quite a few managers to be awakened in the middle of the night to handle a problem that was really quite simple to fix, if only we'd had a smidgen of documentation. This event was a stark reminder that companies have to spend time documenting their environments.
If we had a cross reference of services running on the affected servers, we could have solved our problem in a couple of minutes instead of a few hours.
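That cross reference doesn't have to be fancy. As a starting point, something like this PowerShell sketch dumps every running service on a server to a CSV you can roll up into a master list; the output path is just an example:

```powershell
# Dump this server's running services to a CSV (path is an example)
Get-Service |
    Where-Object { $_.Status -eq 'Running' } |
    Select-Object Name, DisplayName |
    Export-Csv -Path "C:\docs\services-$env:COMPUTERNAME.csv" -NoTypeInformation
```

Run that on each box on a schedule and you have the service-to-server mapping that would have turned our multi-hour outage into a two-minute fix.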
The problem all companies face is staff turnover. When a staff member leaves and there is no documentation, the knowledge is lost. If the company is lucky, it might be able to fish the information out of an email archive, but mostly it's stuck re-learning the process. That is expensive, time-consuming, and embarrassing when it affects customers.
You have to make time to document your processes and develop operating manuals.