C++: Exporting std::string and other STL types from a DLL

The intelligent programmer breaks an entire project into a collection of library projects and a single executable project. Instead of working on one, mammoth executable file, the project is logically divided into functional parts. Besides the advantage of managing a solution in a more organized fashion, compilation times are reduced and the code becomes more easily reused.

I always start out using static libraries in a solution, and eventually find reason to convert some of those static library projects into shared libraries. The advantage of a shared library over a static one is that you can package all of the dependencies into the library file; instead of the main program having to import all the dependencies of your shared library as well as its own, the shared library packages everything it needs. Most of my shared libraries end up becoming git submodules that I share between several solutions. Instead of having to deal with importing several dependencies in the main executable project, the shared library takes care of itself.

When you’re building on *nix, this is no big deal. Change your project output type to a shared library and you’re pretty much done. On Windows, however, this is a completely different matter. In DLL projects, function declarations need a special import/export macro. That’s easy and obvious enough; 30 seconds on Google will provide even the most novice C++ programmer with a working example.
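For reference, the basic pattern looks something like this sketch (MYLIB_API and addNumbers are placeholder names; the macros this article actually uses are defined below):

// MYLIB_EXPORTS would be defined only in the DLL project's build settings.
#ifdef MYLIB_EXPORTS
# define MYLIB_API __declspec(dllexport)
#else
# define MYLIB_API __declspec(dllimport)
#endif

// The DLL exports this function; every consumer imports it.
MYLIB_API int addNumbers(int a, int b);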

The bigger problem has to do with C++ STL types. Windows DLLs have trouble exporting a class that contains them; this is what triggers warning C4251. Even if your STL type is a private member of a class that will never be passed outside of your DLL, it still affects the size of your object. If your DLL was built with one version of a compiler but the calling executable was built with another, chaos can ensue.

In a common header file (MyLibCommon.hpp), we define our macros. Because std::string has the same export throughout the program, I define it in the shared header. std::vector needs to be defined on a case-by-case basis, so I tend to add those declarations to the header file of the respective class.


#pragma once

#include <string> // needed for the std::basic_string instantiation below

// MYLIB_EXPORTS should be defined only by the DLL project itself,
// never by the projects that consume it.
#ifdef _WIN32
# ifdef MYLIB_EXPORTS
#  define MYLIB_DECLSPEC __declspec(dllexport)
#  define MYLIB_EXPIMP_TEMPLATE
# else
#  define MYLIB_DECLSPEC __declspec(dllimport)
#  define MYLIB_EXPIMP_TEMPLATE extern
# endif
#else
// On *nix, the macros expand to nothing.
# define MYLIB_DECLSPEC
# define MYLIB_EXPIMP_TEMPLATE
#endif


/*
* Export STL types from the DLL.
* Doesn't create a std::string, but causes its definition to be included
* into the DLL file.
* Gets rid of warning C4251
*/

// for std::string -- when importing, MYLIB_EXPIMP_TEMPLATE expands to
// "extern", so consumers see a declaration rather than an instantiation.
MYLIB_EXPIMP_TEMPLATE template class MYLIB_DECLSPEC
    std::basic_string<char, std::char_traits<char>, std::allocator<char> >;

 

Now we can export our class without warning or concern. Oh, and I’ve included an example of a std::vector as well. Note that std::vector has to be instantiated for each element type that will be used; int serves as the example element type below.

#include "MyLibCommon.hpp"
#include <vector> // needed for the std::vector instantiation

// One explicit instantiation per element type the class uses (int here).
MYLIB_EXPIMP_TEMPLATE template class MYLIB_DECLSPEC std::vector<int>;

class MYLIB_DECLSPEC MyClass
{
public:
    // … yada yada yada …
    std::string someString;

protected:
    std::string anotherString_;
    std::vector<int> someVals_;
};

The end result is that the STL types used are exported into the DLL file. Now, Windows knows exactly what to build on creation of the class. No more warning C4251. Because the macro switch is written to consider the possibility of being built on a *nix system (e.g., macOS or Ubuntu), none of this occurs on any system other than Windows. Thus, the class builds cleanly on any system with a C++ compiler.
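For illustration, the consumer side might look like this minimal sketch (MyClass.hpp is a hypothetical header declaring MyClass; the executable project does not define MYLIB_EXPORTS):

#include "MyClass.hpp"

int main()
{
    // MYLIB_DECLSPEC expanded to __declspec(dllimport) here, and the
    // extern template declarations stop the compiler from re-instantiating
    // the STL types the DLL already exports.
    MyClass obj;
    obj.someString = "no more C4251";
    return 0;
}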

 


The Ugly Truth about C++ inline functions

C++ is my typical language of choice, and one that has stood the test of time well. Like C before it, C++ is ingrained into computing, and no modern language is practically capable of replacing it, other than C, and one just wouldn’t do that. Still, it’s not without its warts. Oftentimes in C++, not everything is what even advanced C++ programmers believe it to be. Case in point: the inline keyword.

The C++ language wants us to do away with the oft-maligned #define macros. Instead, we are told to replace them with the inline qualifier, which will copy your function “inline” to where it is called, giving the same performance advantage as a #define macro, but in a more concise format that is far easier to debug. Sounds great, right?

In other words, this:

#define QUICK_ADD(A,B) ((A) + (B))

is supposed to be the same in performance as this:

inline int quickAdd(int a, int b) { return a + b; }

The lie is so pervasive that even the Google C++ Style Guide — a comprehensive, albeit misguided, document written by one of the most advanced technology companies in the world — attests that inlined functions can replace #define functions with the same performance. Nothing could be further from the truth.

The advanced C++ programmer will now comment, “Okay, if the inline function is more than a few lines, then it can run slower…” Again, false.

In fact, the inline keyword is a suggestion to the compiler, and one it can choose to ignore. It may even inline a function without the suggestion. If it takes your suggestion, there is an excellent chance that your code will run slower. The idea that keeping an inline’d function under a certain length ensures a performance improvement is nothing but a myth; the effect of the inline keyword is so unpredictable that the only way to determine its effect is to compile, run, and then profile the code.
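If you truly can’t tolerate the unpredictability, most compilers offer non-standard knobs that are far stronger than the standard keyword. A minimal sketch (the FORCE_INLINE macro name is mine, not standard):

// "inline" is only a hint; these vendor extensions nearly force the issue.
#if defined(_MSC_VER)
# define FORCE_INLINE __forceinline
#elif defined(__GNUC__) || defined(__clang__)
# define FORCE_INLINE inline __attribute__((always_inline))
#else
# define FORCE_INLINE inline // fall back to the plain hint
#endif

FORCE_INLINE int quickAdd(int a, int b) { return a + b; }

Even then, Microsoft documents that __forceinline can still be refused in some cases, so the profiler remains the only authority.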

The only truth is that the code will be easier to debug than a #define macro, but when the fuck did #define macros become so terrifying? And who the fuck said this job was going to be easy? Add in the fact that the #define macro above is typeless while the inline‘d function would have to be overloaded for every available type, and the inline keyword can only be considered an utter failure. I’ve never met a programmer that would willingly prefix a function with “something unpredictable may or may not happen, and it more than likely will have a negative effect.” Yet every misled C++ programmer that has typed inline has done exactly that.

My general rule of thumb is this: If shit happens, and not a programmer on the planet can predict exactly what that shit would be, don’t do that shit. Don’t use the inline keyword, don’t be afraid of #define macros, and don’t listen to people that confuse the words “new” and “improved.”

Convert a Mercurial Repository to Git on Windows 10 in 2017

Google the title of this article and you’ll almost always be told to use fast-export (https://github.com/frej/fast-export), along with additional instructions for making it work on Windows 10. The problem is that, while I had used previous versions of fast-export successfully in the past, the latest version wasn’t working on Windows at all.

Still, fast-export works well on Linux and is generally your best solution for converting a Mercurial repository to Git. While running it natively on Windows seems broken, I found a better way: Bash on Ubuntu on Windows.

Open PowerShell with Administrative privileges and run:

Enable-WindowsOptionalFeature -Online -FeatureName Microsoft-Windows-Subsystem-Linux

Turn on Developer Mode by opening Settings -> Update and Security -> For developers. Select Developer Mode.

Now, open a command prompt:

C:\> bash

This will install Ubuntu on Windows. Go through the installation process, then create a user name and password. When it’s done, I’ve had better luck closing that command prompt and opening Bash on Ubuntu on Windows from the Start Menu.

Let’s start by installing the necessary software.


$ sudo apt-get update
$ sudo apt-get install build-essential
$ sudo apt-get install git
$ sudo apt-get install mercurial
$ sudo apt-get install default-jre

We need to handle some “trust issues”; namely, Mercurial isn’t going to trust certain files in the repo because… don’t know, don’t care.


$ nano ~/.hgrc

In the text editor, paste the following:


[trusted]
# Prevents warning : not trusting file .hg/hgrc from untrusted user root, group root
# If initial repository clone was done via windows the user and group will be set to root
users=root

[ui]
# Stops us using vimdiff when resolving conflicts, people familiar with vi can omit this
merge=internal:merge

[extensions]
# Lets us set the eol setting when cloning to windows where we will most often be editing the files
eol=

[eol]
# We want files checked out with windows line endings
native=CRLF

CTRL-X to exit, then Y to save.

We need to navigate to the directory on your Windows PC in which you store your projects. Note that, in Ubuntu, your C drive is at /mnt/c. From there, you can navigate throughout your hard drive. On my computer, I keep all my working projects and repositories under Documents/srcroot. So, on the command prompt, I would navigate to /mnt/c/Users/MYUSERNAME/Documents/srcroot.

$ cd /mnt/c/Users/MYUSERNAME/Documents/srcroot

Now, we clone down fast-export.


$ git clone https://github.com/frej/fast-export

We’re also going to want the BFG Repo-Cleaner.
$ wget http://repo1.maven.org/maven2/com/madgag/bfg/1.12.16/bfg-1.12.16.jar
$ mv bfg-1.12.16.jar bfg.jar

I recommend downloading the latest version of the BFG Repo-Cleaner by finding the link for the latest version at: https://rtyley.github.io/bfg-repo-cleaner/
(I went to the site in my browser, right-clicked the download button and copied the link address, but feel free to be more clever.)

You only need to do all that setup once. Now, let’s get on to actually exporting your Mercurial repository to git.

In this example, I now have in my srcroot directory, among other folders, these sub-folders and files:

+ srcroot
– bfg.jar
– + fast-export
– + some-hg-repo

I need to make a new git repo for my exported project.


$ mkdir my-git-repo

So, now my directories include:
+ srcroot
– bfg.jar
– + fast-export
– + some-hg-repo
– + my-git-repo


$ cd my-git-repo
$ git init

And now we put fast-export to work.


$ ../fast-export/hg-fast-export.sh -r ../some-hg-repo --force

Note that --force makes fast-export push past case-insensitivity issues and unnamed-HEAD errors. Unfortunately, there isn’t a cleaner workaround for either of those issues.

Most likely, the exported Git repository is HUGE, and that probably isn’t all Mercurial’s fault. We can quickly and easily prune down the repo using the BFG Repo-Cleaner. Most important is to cut out any files larger than 10MB; anything larger than that should be stored using Git LFS.


$ java -jar ../bfg.jar --strip-blobs-bigger-than 10M .git

Then we need to commit the changes to the Git repo.


$ git reflog expire --expire=now --all && git gc --prune=now --aggressive
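As an aside, if some of those big files genuinely belong in the repository, Git LFS (mentioned above) can track them instead of BFG stripping them. A rough sketch, assuming git-lfs is installed and *.zip is the offending file pattern:

$ git lfs install
$ git lfs track "*.zip"
$ git add .gitattributes
$ git commit -m "Track archives with Git LFS"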

The BFG Repo-Cleaner is beautiful work. Please consider supporting the project on its webpage.

At this point, we have a clean Git repo. Configure your remote, push it up and get back to writing that next great program.
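For completeness, that final step looks something like this (the URL is a placeholder for wherever you host your Git repositories):

$ git remote add origin https://github.com/YOURNAME/my-git-repo.git
$ git push -u origin --all
$ git push origin --tags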

The Business Macintosh: A geek gets up and running.

As mentioned in a previous article, I recently ordered a MacBook Pro to replace my high-powered Windows machine. This is a no-compromise transition; I need to do everything I did on the Windows machine, and it has to be easy to do so. When my new machine arrived, I had already done a lot of research, planned for the transition and even helped a friend set up his new iMac, so I was pretty sure I knew what I was getting into.

In truth, there have been only a few surprises, all of them minor. In this article, I’ll detail transitioning from a Windows machine to the latest Mac and offer some solutions for the problems that may occur.

THE ETHERNET PORT — No Soup For You.

My first speed bump came before I ever turned the system on, and the blame is squarely mine. When I went to the Apple store before ordering the computer, I thought I had checked for an Ethernet port. If I did, I checked on an older model. The MacBook Pro I ordered has no hard-wired Ethernet port, only wireless. My stomach dropped because I often need a hard-wired Ethernet port when setting up industrial automation for work. It’s not a day-to-day need, but it is a deal breaker.

A quick Google search revealed that I had simply missed the fact that Apple has removed hard-wired Ethernet ports from their laptops. Frankly, this isn’t surprising; I hardly ever see a laptop user plug the port in. The articles also listed a solution: for $30, Apple sells a Thunderbolt-to-Ethernet adapter that is literally plug and play; not even a driver needs to be installed. I later swung by the local Apple store, picked one up and it worked instantly. This solution is, for me, ideal; I only use the Ethernet port when I’m plugging into embedded or industrial networks for programming, and I already have to carry an assortment of cables for my job. I’ll toss the tiny adapter in my briefcase, use it when I need it, and, 90% of the time, I’m carrying around a professional laptop that is almost as thin and light as the MacBook Air.

As a side-note regarding the MacBook Air: I don’t see the point. The low-end MacBook Pro is nearly the same price, nearly the same dimensions and a better computer. The MacBook Pro on which I’m typing this article is terrifyingly thin; the MacBook Air seems redundant to me.

SETTING UP TO FAIL: The buggy setup wizard.

Considering that ease of use is Apple’s claim to fame, I found it ironic that the buggiest, most problematic part of the operating system is the absolute first thing one sees when they turn on the Mac for the first time: the Setup Wizard. Having just helped a friend with his iMac, I thought this was going to be 45 seconds of clicking and then on to the fun. Wrong.

There are two failures in the Setup Wizard, one minor and one that Apple should be embarrassed about. Both of these failures happen to be the two features I didn’t use when helping my friend with his iMac, so they took me by surprise.

I’ll start with the minor problem: there is some bug that prevents the Setup Wizard from validating your Apple ID. The Setup Wizard says there is a problem with the Apple servers and that you should try again later. I did some reading, and it turns out this is a common problem; the issue is somewhere in the Setup Wizard. The solution is to skip that step, go to System Preferences, select iCloud, and enter your information there. That works and it’s fairly easy, but buggy software relying on workarounds is just so… Windows. Once I did put my iCloud ID into System Preferences, the rest of the process was smooth. Even my iPhone suddenly displayed a message asking me if it could trust my new MacBook Pro. I said yes, and they’re rather chummy now.

The real problem was the Migration Assistant, which Apple should either fix immediately or remove from the operating system, never to speak of again. On the surface, the Migration Assistant seems like a treasure for anyone moving to a new computer, be it from Mac or Windows. It works well for neither situation.

Problem Number One: The Migration Assistant takes over your Mac for however long it’s copying files. I was transferring 43GB of music and data from my old machine, which is about 4 hours of transfer time. On the Windows machine, where I installed the Migration Assistant’s companion program that would send the files over to the Mac, the computer otherwise ran normally. No matter whether you start the Migration Assistant from the Setup Wizard or later from within the Applications folder, you’re not doing anything but migrating for a while. Of course, this often leads to people having better things to do and choosing to migrate while they’re sleeping at night, so they skip through the Setup Wizard and run the migration later. This leads to Problem Number Two.

Problem Number Two: If you run the Migration Assistant from the Setup Wizard when you first turn on your computer, it pulls the files into your user account. If you run the Migration Assistant later, it makes a new user account and leaves the account you’re actually using untouched. This makes no sense. You’re migrating, not copying users for an IT department. The only reason anyone would run a Migration Assistant in the first place is to pull files into their current user account; that’s what the name “Migration Assistant” fucking implies. I woke up in the morning to find that my MacBook Pro had indeed pulled all my files off the Windows laptop… and put them in a new account. Some Google searching revealed that this happens to almost everyone that tries to use the Migration Assistant. I also learned that the Apple Store won’t touch the program; they transfer the files for you by copying manually.

There is a trick, however, to fix this situation. It’s another workaround, but if you’ve already spent the hours pulling over the files only to find they all went into another account, the time need not be wasted.

— Delete the user account the Migration Assistant created.

— You will be asked if you would like to save a copy of the user data. Say yes, as a disk image.

— A disk image of all your files is created. The nice thing that happens here is that all your files are in the directory structure you need; copy and paste them right into your own user data location, putting all the files where you wanted them in the first place.

I navigated to the disk image, opened it up, grabbed the music directory, pasted it into my own music directory, then opened iTunes. Boom. My music library exactly as it was on the Windows machine. I did the same for my documents, photos, etc. When I was done, I deleted the disk image from my hard drive.

It was a good way to make the best of the four hours of file copying my MacBook Pro did, but if I just wanted to copy files and manually drop them into folders, I could have done so without the Migration Assistant. The rule of thumb is: if you can get the Migration Assistant to work for you during the Setup Wizard and you don’t mind waiting 3 – 10 hours for the process, then it might be worth it. But if you’re at all computer savvy or need to skip Migration during the Setup Wizard, you’re better off copying manually.

This is where the negativity ends, however. The rest of setting up my new MacBook Pro was fast and simple, and the system is an absolute joy to use.

THE GOOD AND THE BEAUTIFUL

I’ll skip over detailed reviews of the hardware, for which there are numerous articles. Suffice it to say that this is the laptop to beat. The Retina display is the absolute best I’ve ever seen. The keyboard looked like it would take some getting used to, especially after I found the iMac keyboard to feel clumsy, but the MacBook Pro keyboard is superb. I’ve never been a fan of touch pads; I find them slow, sloppy and awkward. I usually disable them and carry around a traditional mouse. The Mac trackpad is the first I’ve liked on any computer. I’m still a handheld mouse lover, but this is the first touchpad that actually seems worth installing into a computer, and the iPad-like gestures it supports for scrolling make it a good mouse companion.

The idea isn’t to compare Dell to Apple, though; I’m interested in how well Mac OS X works when compared to Windows, and how hard it is to get up and running with my complicated setup. Frankly, the Mac did not disappoint.

The network performance of the machine is the most striking difference. Side by side, the same network, both connected over wireless, the Windows computer more powerful and its virus scanner disabled, the Mac loaded pages, downloaded files, uploaded files and simply responded more quickly than the Windows machine.  And just in case you think I’m comparing a Chevy to a Ferrari, keep in mind that the Windows laptop is a portable workstation that cost $500 more than the MacBook Pro. The Bitcasa application I use was the clearest example of this. I recently switched from Mozy to Bitcasa for offline backup and mirroring. The Windows machine took three days to sync up 50GB and could sometimes drag the computer to a crawl while uploading. On the Mac, Bitcasa mirrored the 50GB overnight and at no time seemed to cause any load on the computer. Certainly, I’m not doing any type of scientific measuring, but after two days of using the machines side by side, the Mac screamed through every Internet-related task.

In fact, everything feels faster. The operating system has a certain “snap” to it. Click the “Finder” button (the equivalent of Windows Explorer) and it instantly appears. When a program pops on the screen when using the Mac, it’s ready to go. Type a search term into Spotlight (the equivalent of the search box in the Windows Start Menu) and the results appear instantly. Searching on Windows is still a slow, tedious process that can take several minutes to return results; Microsoft’s failure to implement a reasonable, performant search engine is glaring. OS X returns results in less than a second, and searching on the Mac is an absolute joy.

Downloading and installing software from the Internet was quick and easy, and the App Store in OS X is actually a nice touch that brings iPhone/iPad familiarity to installing programs and updates. Most people have no trouble dealing with the App Store on their iPhones or Android devices, but have panic attacks when a program or update needs to be installed on a computer. The App Store may sound like Apple’s silly attempt to merge the OS X and iOS experience, and that was my take on it before I had used the program, but watching a 68-year-old man that didn’t speak English update the operating system on his new iMac with a calm click changed my mind quickly.

WHEN IN ROME…

There is a slight adjustment switching to the Mac, emphasis on slight because, up until Windows 8, Windows was borrowing heavily from the OS X design. You’ll have no trouble using the computer or finding your files, but you’ll have to adjust to clicking the cute little Mac face on the Dock to open a Finder window instead of a little folder. If you’re into using keyboard shortcuts, as I am, they are different. And the Delete key is actually a Backspace key, but that’s really just a change in terminology.

Then there is the menu bar always being at the top, which I’ve always been a fan of, but it does seem jarring for most former Windows users. It actually makes a lot of sense; the original designer of the Mac realized the top menu made for easier navigation. Instead of having to fit your mouse into a narrow menu, you just head up to the top. The combination of the menu bar on the top and the dock bar at the bottom is a winner, in my opinion. I have a very visible and readable interface, icons for the background processes in the upper right, a clear menu in the upper left, and icons for more than 20 programs a constant presence at the bottom of the screen. Sure, you could pin a few programs to the Windows 7 task bar, but not 20, and Windows 8 is a mess on the desktop. The compromise of having the menu bar always at the top on the Mac is that you sometimes have to look in the upper-left corner to see which program the menu is referring to. This sounds tedious, but, in practice, the menu is almost always set to the program you expect.

What Windows users will miss the most is the “maximize” button. On the Mac, every window sports a red close button, a yellow minimize button and a green… I’m not exactly sure what to call the green button with a “+” sign. It’s not strictly maximize, although some programs wisely treat it as such. Other programs will simply make the window taller, and you’re left to drag the window to fill the screen. OS X Mavericks, however, does feature a new button on the right side of the window that turns the program into a full-screen application. This is a feature I love, and it has pretty much replaced the maximize button for me.

The mouse wheel will take a little getting used to; it works the opposite of Windows because Apple decided that all scrolling should mirror the iPad. This is a decision that works well on the touchpad, not so much on the mouse wheel. Then again, I’m using an old mouse from a Windows PC, while the new Apple mouse doesn’t have a wheel; you use your fingers like you would the touch pad. On my friend’s iMac, I was impressed by the mouse, so it’s very possible Apple made the right decision and I’m still clinging to a Windows mindset. Still, my MacBook Pro doesn’t come with a mouse and they’re $70, so I’ll be adjusting to the mouse wheel for a while.

[Edit: For half the price, I bought the Logitech Ultrathin T631, which offers the same features as the Apple mouse in a smaller, more portable footprint.]

Overall, however, I see nothing in OS X that will send a Windows user running back for Windows 8, or even 7, while there is a lot to fall in love with.

MICROSOFT OFFICE… THE MIXED BAG

One of the first programs I installed was Microsoft Office, since my company would consider this a deal-breaker if the software wouldn’t run on the Mac. The good news is that Office, particularly Word and Excel, works and runs well; in fact, since the Mac versions were published in 2011, they feature the desktop-style interface that was far more usable. My suspicion is that someone switching from Windows to Mac and installing Office will breathe a sigh of relief. Word, Excel and PowerPoint run fast and are easy to use.

So where are the complaints? They’re somewhat nit-picky. Not all the icons in the Office programs have been updated to handle a Retina display, so the larger icons tend to be grainy. Oh well. The main problem is Outlook, which is a program I have never liked. It’s slow on Windows, and it’s slow on the Mac. Not the interface, which is snappier than its Windows equivalent, but downloading email and syncing with the Exchange server are no faster on the Mac. All the same problems are here, including sketchy operation if your inbox gets near 5,000 messages. It also decides to download your entire email account onto the computer, including attachments… which isn’t really what a business user would want; your IT department is already backing up your email on the server, so why have a third copy of messages you’ll be ignoring a few days later? While Outlook is busy spending days downloading every attachment you’ve ever received (and chewing up hard drive space), it’s slow to realize you have new emails (sometimes 12 hours or more slow). Initially, I thought Outlook simply wasn’t working and had switched to using Apple Mail, but then I did some reading and found out what was going on. The best solution is to set Outlook to only download the email headers; it leaves your email and attachments on the server unless you click on the email to read it. You save hard drive space, and Outlook works like a charm after that.

Apple Mail, which now supports Microsoft Exchange, syncs far more quickly and works out of the box. It’s easily the best desktop mail client, but ultimately I decided to stick with Outlook because it supports some advanced Exchange features that a business user will use regularly, particularly with shared calendars and appointments. Outlook is what your company is going to want you to use and it will make your Mac fit in easily in the corporate setting. Do you really want to be the person that screwed up a group appointment because you dragged it to a new time in Apple Calendar? However, if you’re not a business user and you’re just connecting to a personal email account, stick with Apple Mail even if you purchase Microsoft Office. Outlook was never designed with the individual in mind, no matter the operating system.

There isn’t much else to say about Microsoft Office on the Mac. Everything works like the Microsoft Office a Windows user has come to expect, only the interface is a bit more familiar to someone that grew up with a traditional desktop.

GRAB THE REMOTE: RDP into the Windows Machine at Work

Because we’re a micro business and I’m the lead developer, most of the IT work falls to me, particularly if there is a crisis. Most crises tend to happen off-hours or when I’m traveling, so I frequently use the Windows Remote Desktop to take over the server or a machine and fix a problem.

This was an early concern for me, and I had already installed Windows 7 into a VMWare Virtual Machine and logged into the server through there before it even occurred to me that Microsoft might be supporting a client on the Mac… which it does.

Remote Desktop Client for Mac: http://www.microsoft.com/en-us/download/details.aspx?id=18140

There isn’t much to say. Install it and use it exactly as you would from Windows, but without any need for Microsoft Windows or VMWare.

 

ONE BROWSER TO RULE THEM ALL…

Out of the box, your Mac is going to have Safari installed as your default web browser. Safari is a wonderful browser that is lightning fast and easy to use. Unfortunately, some advanced sites simply won’t work with it, and most websites are starting to employ advanced desktop-like features. Google Chrome remains my favorite browser, regardless of the platform, and chances are good that you were already using the browser on Windows. On OS X, Chrome is a 32-bit program and doesn’t render pages quite as quickly as Safari, but it will work with every single web page you’ll try to load. Unless you like switching between browsers, install Chrome and set it as your default.

 

WINDOWS NETWORKING TIPS

If you’re using your Mac at work — and why wouldn’t you? — then chances are good that you’ll have to connect to a Windows server. OS X is perfectly capable of doing this, but there are some tricks to make the process smoother.

Let me start by saying that, at my tiny office, we have a very poor Windows server that is not set up with comprehensive support for Mac or Linux. If you work for a large company, you may have no issues.

Windows Networking works, but it can be slow unless you use CIFS to connect. For whatever reason, using Windows networking shares on your Mac simply won’t be as quick as it was on your Windows machine, but CIFS makes the difference negligible. What is CIFS? Nobody cares. To connect to your Windows server:

  • Click on the Finder Icon.
  • In the Menu Bar, select Go->Connect to Server
  • Enter the address of your server. It’s probably something cute like mycompanyname.local, or it might be a server IP address (192.168.16.2 is a good bet). You’ll probably know this already. The trick is that we want to preface the address with “cifs://” rather than “smb://”.

Example: cifs://192.168.16.2

My advice, and this is regardless of operating system, is to copy down the file(s) you need, work on the file(s) locally on your machine, then copy the file(s) back up when you’re done.

There are supposedly some other tricks for making your Mac network more quickly with a Windows network. I haven’t tried them yet, but I will, and I’ll post any progress on this blog.

 

FINAL THOUGHTS

After 20 years in the world of Microsoft, I’ve now been using a Mac full time for three months. In that time, I’ve used it to do advanced web programming, Linux kernel module development, application development in C and C++, industrial programming and serial communication. The only speed bump has been researching the programs I need to get the job done, because I’m new to the ecosystem, but I’ve never needed to grab my Windows machine to do my job, although I do pop into Windows in VMWare from time to time. For three months, I’ve never had to remove spyware or viruses, never had a long boot time, never had to spend a day servicing or tuning up the computer. I turn the machine on and it works, which is all I’ve ever wanted from a computer. A computer is a tool, not a hobby, and I couldn’t be happier with the MacBook Pro.

 

Programs you may find helpful, most of which I found in the App Store:

  • SerialTools for serial communications. If you need a USB->Serial port, you’ll want one with an FTDI chipset. I searched for this on Amazon: FTDI Chipset High Speed USB 2.0 to Serial RS-232 DB-9 Converter.
  • AVI video doesn’t always play on Mac (or even Windows), but some weirdos will still send them to you. If that happens, a free app called SmartConverter will quickly convert the video for you.
  • I use SSH often. I found a program called vSSH Lite easier to use than the Terminal command line that is built-in, although I have been using Terminal more and more.
  • iZip Unarchiver is a must-have for dealing with zip files.
  • If you’re looking to do 2D CAD, DraftSight is free from the company that makes SolidWorks and is 100% AutoCAD compatible.
  • iDraw is the best graphics program for those of us that suck at doing graphics. Our local Photoshop and Illustrator expert actually needed me to convert some files for him. Worth every penny.
  • VMWare is far superior to Parallels.
  • Yummy FTP sounds silly, but works great.

I fucking love the Wii U, but…

One of my favorite lines from a movie is, “More isn’t always better. Sometimes, it’s just more.” It’s a line the pundits in the video game industry would do well to remember.

Years ago, I bought a Nintendo Wii because it was cheaper than the competing systems, more fun than the competing systems and had the games I was most interested in playing, which largely means Zelda with a dash of Metroid, and that Wii Sports game was hella-fun. I also had young kids and, though they were a bit too young to be much into gaming, they enjoyed the sports game and Metroid Prime. For a year, I played the Wii regularly.

Things changed a year after that as I decided to “cut the cord” and be entertained primarily by streaming Netflix and Amazon Instant, with the random DVD for shows and movies not yet available online. Here, the Wii failed me; you needed a special CD to do Netflix, there was no Amazon Instant, and forget playing DVDs because Nintendo has some obsession with using proprietary disc formats. To that end, I went to buy a smart DVD player, but there was a sale on the slim PS3. A reasonably-priced, top-of-the-line blu-ray player that could do Netflix, Amazon Instant and DLNA, and I could try out some new video games if I was in the mood? Sold. As an entertainment hub, the PS3 was unmatched. For games… the Wii went into a box and I played Uncharted for a month or so, but I never finished it. While I used my PS3 daily for Netflix and Amazon, I never much used it for gaming.

When my kids were older and wanted a system, they didn’t much care for the PS3, so I purchased them an XBox 360. Here begins the story of my return to Nintendo. My kids spent two years playing Lego Star Wars and Lego Avengers on their XBox 360, with little interest in any other games. The XBox 360 was, from a parenting perspective, a nightmare. First off, it easily has the worst user interface of any game system I’ve ever seen. It’s cumbersome for an adult to use; a nine year old must constantly fetch a parent to help him navigate the complicated menu mess. Just because the system is plugged into the Internet doesn’t mean it’s on the Internet. Unless your child picks the right account linked to XBox Live, he can forget about streaming. Then, of course, we have the games issue. There really isn’t much for young children on XBox; both XBox and PS live and die by the first-person shooter, most of them rated Mature and featuring inappropriate language, sex and alcohol consumption. Now, I’m an open-minded, liberal father that generally enjoys inappropriate language, sex and alcohol consumption, but when my 9 year old wanted Assassin’s Creed 4, in which the player goes to the bar, orders booze, then spends the night with a hooker… yeah, that’s a bridge too far. I said no and told him to pick another game not rated “M,” but he was tired of Lego games, so that left him with… Oops.

At which point, I said to myself, “Man, I should have kept that Nintendo Wii.” By this time (April 2014), the Wii U had been out a year and not selling very well, but new games had started coming out in a steady stream to rave reviews. I showed the trailers of the games to the kids and they went nuts. Super Mario 3D World, The Wonderful 101, Lego City Undercover, Zelda: Wind Waker, and that’s just to name some off the top of my head. These are games that even I would (and have started to) play, and I would never have to worry about finding enough appropriate games on the system for kids. On the subject of games, however, I’m not a big first-person shooter fan. After about an hour, the fun wears off. Don’t get me wrong, I played plenty of Doom in my day, but adventure games and platformers are what I’ve always played most, and Nintendo owns that market.

So, we’re a Wii U family now, and I’m not going to delve into a detailed description of the product because you can find those throughout the Internet. Instead, I’m going to refute all the complaints you’ll read in a review about the system or in forums on the Internet.

  • The Graphics: Absolutely, positively awesome. It can meet or exceed an XBox 360 or PS3, and Wii U games I’ve played look fantastic. This is the point where “gamers” look at raw stats and say that the system doesn’t compare with the XBox One or PS4. Correct, the system isn’t nearly as powerful, and it really doesn’t matter. For all their power, the XBox One and PS4 pump out graphics only marginally better than those of the previous generation. The XBox One owners I know don’t really see the point over the XBox 360. Don’t let raw specs fool you; the fact is that graphics will not noticeably improve until game systems go 3D, which isn’t going to happen until 3D televisions are standard in US and Asian households. That’s not likely in the next ten years, and not in the cards for any current gaming system. Graphically, you’re going to be satisfied with any modern gaming system, so let the content make your decision.
  • The Games: The knock on the Wii U is that third-party support has stalled. That’s not ideal, but Nintendo has always survived on first- and second-party titles. That leads us to must-have games on the Wii U, of which there are at least a dozen. Compare that to the XBox One which has Titanfall and… Well, I hope you really like Titanfall. The PS4 has some upcoming titles, but overall there isn’t a compelling game library for any of the next-gen systems except the Wii U. In a year, the XBox One and PS4 will have a deeper library, but so will the Wii U, which is currently pumping out a game per month to stellar reviews. A year from now, the XBox One and PS4 will have saturated themselves with a library of high-quality first-person shooters, some good sports games and a remainder of ho-hum games, the best of which are likely to be available across all platforms. Nintendo will never be the darling of the people that just want to play first-person shooters, but the Wii U is shaping up to have the strongest game library overall. For old folks like myself, that grew up on the NES, SNES and N64, the Wii U Virtual Console is an extra attraction, giving us access to a large library of classic games.
  • Media: The days of needing a DVD for streaming services are gone, and the Wii U does Netflix and Amazon Instant with aplomb. This is one of the (few) areas where the touch-screen gamepad really pays off. Instead of picking the letters on the screen to type, you simply use the gamepad like you would your iPad. While I don’t think the gamepad pays off in video games, it works very well as an advanced remote. XBox One media integration, while more comprehensive than that of the PS4 or Wii U, is still convoluted, touchy and comes to the tune of $500, plus $50 a year for a subscription to the service. Even PSN is charging for many of their online services. When it comes to media services, the Wii U is a bargain compared to the other consoles.
  • Sharing of Nintendo IDs: This is something that bothered me on the XBox 360. If the kids didn’t log in as my account, they couldn’t use the Internet, and I certainly wasn’t going to pay for multiple subscriptions. Nintendo IDs are free, and I can link all our “Mii” accounts to the same Nintendo ID. Nice, easy and it works for everyone. Also — and I admit that this is young-child centric — Nintendo does a better job of verifying a parent and parental controls for the system, and they make it very easy to set everything up.
  • Group Play: You can go online and play with an endless supply of strangers on your XBox or PS4, but if you have four or five friends over and want to have some big ruckus, the Wii U is the console that crushes the others. Nintendo is the king of local multiplayer, while the other systems really just want you to play with others online.

So, the Wii U is going to take over the world and have a big renaissance in 2014, knocking down Sony and Microsoft, right? Well… I’m not ruling out the possibility, because the potential is there, but it’s going to take some tough choices at Nintendo.

Let’s start with the hallmark of the Wii U, the gamepad, which is the game controller with integrated touchscreen. The innovative Wii-mote drove the rampant success of the Wii, and the Wii U gamepad was supposed to do the same for Nintendo’s newest system. But the Wii-mote wasn’t a simple gimmick; it was a core technology that revolutionized the way games were played on the system from top to bottom and it worked flawlessly. The gamepad works flawlessly… but that’s it. It adds little, if anything, to gameplay; what it does add is almost always nothing more than convenience. You still need at least one Wii-mote and nunchuck with the system, and they’re still the best and most fun controller to use, not to mention the only way to do multi-player. If Nintendo had put out a Wii U that standardized on the Wii-Mote Plus and could optionally connect with your iPad or Android to enhance gameplay, Nintendo wouldn’t have been able to build the console fast enough. Instead, they tried to drive sales with an innovative controller that really wasn’t innovative, confused everyone, drove up pricing and scared off parents that foresaw fights between siblings over who got the lone touchpad controller.

The gamepad is an undeniable failure and the albatross around the neck of an otherwise fantastic system. Consider that the high manufacturing cost of the gamepad led to the cost-cutting that reviewers hold against the system:

  • The system can’t play blu-ray disks because supporting blu-ray entails paying a licensing fee. 
  • The system has a weaker processor because more CPU means a lot more money.
  • Cheap plastic exteriors. Okay, this is a stupid complaint because I found the system to be solidly built, but more money on the gamepad certainly means skimping on plastics.

Then, there’s TVii, which basically means your gamepad is your TV and cable box remote… if you’re still one of those weird people that doesn’t have a soundbar or external amplifier for your home entertainment system, which anyone that would consider using TVii definitely would have. Again, it works, it just doesn’t drive sales. Make the gamepad a true universal remote that can control my television, my amplifier, my cable box and my blu-ray player and now we’re talking. Still, only the XBox One tries to be your everything-to-everyone media hub, and that doesn’t drive sales either, so this is really a non-issue. If the recent console wars proved anything, it’s that the people buying consoles are still buying them for the games and the gaming experience. TVii is the last thing Nintendo should be worrying about. Rather, I wish the damn thing could play Blu-Ray disks.

Mainly, though, the problem at Nintendo is leadership. CEO Iwata announced today that, once again, Nintendo is losing money and that Wii U sales are continuing to slow. He followed up by saying that Nintendo will refocus on exploiting the possibilities of the gamepad, and that the company is only looking to promote its system on smart devices like iPhones and tablets, not to put content on or integrate with them. Sigh. Nintendo refuses to operate outside its little bubble, and that bubble is shrinking. The frustrating part of the situation is that the solution lies completely in software: keep the excellent games coming, pretend the gamepad doesn’t exist, push the WiiMote and publish apps for 3DS, iOS and Android that interact with the Wii U.

Still, I’m not one of the people predicting doom and gloom for Nintendo. The 3DS sold 12 million units last year and has an incredible library of games, so Nintendo isn’t anywhere near leaving the gaming market. The Wii U is a solid system with a great game library behind it; if Nintendo ever pulls its head out of its ass, smart leadership could turn the system into a real winner. Nintendo is never going to pull out of the console market, as so many pundits are fond of predicting, although I do think the Wii U will have a follow-up sooner rather than later. Turning a disappointing system into a success by releasing a modified version isn’t unprecedented. Don’t forget that the PS3 was initially considered a relative flop: it was too expensive and didn’t have the games or online capabilities to compete with the XBox 360, which had essentially stormed the market, and the API needed to write games turned off a lot of third-party developers. The PS3 Slim and enhancements to the PlayStation Network changed all that. Imagine a Wii U2: slightly better processor, no gamepad, packaged with two wii-motes and two nun-chucks and your choice of Mario Kart 8, Super Smash Brothers or a brand-new release of Zelda. Winner, winner, mushroom dinner.

Those are things I’d like to see, but I’m still happy in the present. For me and my family, this is definitely the system for us. I play Mario with my kids, I’ll probably buy a Zelda game, and my kids are finding a lot of fun, appropriate games that interest them. If the PS4 is about hardcore gaming, the Wii U is about fun. Sure, the XBox One and PS4 can crunch more pixels than the Wii U, but more isn’t always better; sometimes it’s just more. The system with the strongest game library is the Wii U, the games look great and the system costs a good deal less money. When the alternative is being underwhelmed by the XBox One or PS4, the Wii U doesn’t feel like a bad buy at all.

Return of the Mac: A geek leaving Windows

I’m a computer programmer that has spent 90% of his career writing software for Microsoft DOS and Windows. I started when I was eight, got my first internship at 16 and have been working as a programmer in one capacity or another since then. As a person often asked to fix computers, when a family member wanted a recommendation for a computer to buy, I always told them to get an Apple Macintosh, because I quickly tired of spending every holiday in someone’s home office removing spyware and updating drivers. I used to tell people how much I loved Apple’s products and their operating system, but my career was never going to allow me to have a Mac as my full-time computer.

I just ordered a new MacBook Pro to replace my full-time computer.

Let me be clear: this isn’t my surf-at-home computer. I bought it to do Windows programming, industrial systems programming, Linux programming and development, javascript and web programming, 2D and 3D CAD drawing, Microsoft Word, Microsoft Excel… and basically anything else you would think about using a computer for. Yes, I’ll also be using the computer for music production, probably using ProTools, and I’m going to be writing some apps for iOS, but this computer must gracefully replace every function for which I use my Windows computer.

The simple fact is that I’m tired. I’ve given Microsoft and Windows twenty years to fix problems with boot times and decaying performance, security, DLL hell, GUI responsiveness and overall stability. Along the way, Windows has “borrowed” some GUI ideas from Apple that led to an excellent interface concept in Windows 7, but all the basic problems are still there. My boot time has become atrocious, performance is dwindling, I’m constantly removing spyware, hunting down drivers and DLLs, and GUI responsiveness has always been a no-show. A good week is when I’ve only spent four hours trying to fix something on my Windows computer; usually, it’s six.

While Windows didn’t progress for twenty years, Apple brought OS X into the twenty-first century, particularly with the latest version of its operating system. A Mac is now an enticing business platform that should seamlessly integrate into any company’s IT infrastructure and offer an unbeatable user experience, while the Internet and web browsers have nearly made your operating system insignificant.

A common counter-argument from Microsoft die-hards is that people tend to buy cheap, problematic PCs and compare them to the high-quality hardware in an Apple Macintosh. Ah, if only it were true that one could “buy” their way out of the problems of Microsoft Windows. My Windows computer cost 40% more than the MacBook Pro I just bought, has a faster processor and a better graphics card; good luck getting it to boot in less than 10 minutes. It has almost twice the power of my previous Windows laptop (also more expensive than a MacBook), but you would never know it to use the system. My Intel 386 running DOS 6.0 with 16MB of RAM was one of the most responsive machines I’ve ever used, but these days you press a key on your Windows machine and wait. Inevitably, some fool will tell you to replace your computer with a new Windows PC, but at no point since 1995 has the performance of Windows been improved from the perspective of the user.

The Windows-phile will interject, “Well, you just have to…” and list a series of ways to clean and improve the system performance. I know how to do all these things; I get paid to do them for other people. I’ve spent days at a time trying to optimize the performance of my Windows laptop; the results are rarely noticeable, but, more to the point, I shouldn’t need to. A computer is a tool, not a personal project. When I buy a wrench, I don’t want to spend three hours fixing it just so I can tighten a bolt. We buy computers to write our documents, do our jobs, file our taxes, research school projects, post pictures on Facebook and make the next great album. No one orders a Dell because they wake up one morning and think to themselves, “I want to fuck around in a bloated Windows Registry and reduce the number of startup applications so that my PC boots faster!” Consider the irony of the Windows ecosystem: we buy into an operating system because of its plethora of available software, only to be terrified of installing software because of the toll it may take on the system’s performance and security.

At last, with all the changes in computers and technology, there is no reason to suffer such absurdity.

So, I’m switching to Mac, and I have to admit that the fates have really aligned to make the switch possible for someone like me. The average computer user has been able to switch for years; most people only need a web browser and Microsoft Office to get their jobs done. Ironically, Microsoft Office on the Mac is far more elegant than its Windows equivalent. The last twelve months have made switching even easier. First, Apple has really stepped up their game with OS X Mavericks. Switching from a PC? There’s a Migration Assistant you install to pull over all your files to the Mac. Full support for Microsoft Exchange, which means your work email, calendar and contacts are all at your fingertips with the included Mail, Calendar and Contacts programs, even if you don’t buy Microsoft Office. Not even a Windows PC can offer that. Integration with the network at work is fairly seamless.

In the meantime, Microsoft has hit its user community over the head with their Windows 8 design language (the look and feel of a program) dominating every product they publish, particularly Microsoft Office. Their tablet-first interface is alienating desktop users, and Microsoft couldn’t care less because it’s focused on becoming a cloud and services company; they’re far more interested in selling you a subscription to Office365 than a copy of Windows. My company switched to Office365 six months ago; all I had to do when I opened my Mac was navigate to my account, click the Office for Mac button, and it installed cleanly on my computer. After months of complaining about the latest Windows version of Office, it’s refreshing to use the Mac version, which actually looks like a program intended to run on a desktop, and kudos to Microsoft for delivering an excellent cloud service.

Beyond the basics, I mentioned I write a lot of software for Windows and that I do 2D and 3D CAD. Because Windows drivers don’t work between x86 and x64 versions of Windows, I’ve had to do a good percentage of my programming in a virtual machine already. I was running Windows… inside of Windows. VirtualBox is a capable offering, but I still find VMWare worth every penny of its reasonable price. I’ll be installing VMWare Fusion on my Mac and running Windows inside my Mac for programming, so I’ll be right back where I was on my previous PC. 2D CAD will run fine in VMWare, but SolidWorks could get cumbersome. Luckily, all Macintosh computers come with Boot Camp, so I’ll be able to boot into a Windows partition on the rare occasions I need to do 3D CAD. I did some reading before buying the Mac, and it turns out SolidWorks on a Mac using Boot Camp is pretty common.

Then there is the ultra-odd-ball case like myself: the Linux programmer. These days, the majority of my programming is for real-time Linux systems controlling industrial automation, but VMWare makes this easy. I was already using VMWare on my Windows machine for all my Linux programming, so all I have to do is buy a license for the Mac version and copy over the virtual hard drive file. Done. I’m back up and running with a single file-copy.

Notice that at no point did I consider switching to Linux as my primary desktop operating system, even though I spend most of my time working in Linux. As a big fan of Linux in the embedded, real-time arena, I readily admit that it’s an abysmal desktop operating system. My Macintosh will easily fit into my office network and gives me Exchange email, Microsoft Office, not to mention iTunes and support for my precious iPhone. The Linux community flatly rejects proprietary software and proprietary drivers, and actively drives away developers that don’t conform to the community’s narrow vision of computing that is stuck somewhere in the late 1970’s. When embedding the operating system on specific hardware or running as a server, Linux is fantastic, but as a desktop operating system it has all the problems of Windows with none of the advantages while being slower, uglier and requiring even more maintenance. When I told my office I was switching to Mac, everyone was excited to see how it went. Later, I was told that I was not allowed to switch to Linux full time, Microsoft Office being a chief concern. This was 20 minutes after being congratulated for my success in implementing Linux in our products.

A place for everything, everything in its place… Microsoft’s Office product has long been the cash cow of the company with no real competition, and their Office365 offering outclasses Google Apps on several levels. Windows 8, meanwhile, is a worse failure than Vista that is bleeding into every product, even the successful Xbox, and Microsoft sees its future elsewhere. Apple’s cloud-based services have nice interfaces (I’m actually a full-time user of the iCloud email), but are one of the weakest offerings on the Internet; in fact, you have to have an email address somewhere else to actually use iCloud. The Apple Macintosh, OS X, iPhone and iPad are the standard-bearers for desktop and smartphone systems and are gaining ground in the business venue, which isn’t their historical strong point. As for Linux, it’s dominating the embedded market with little sign of slowing down.

As far as I’m concerned, that sounds like a plan, and that’s how I’ll be living for the next few years. Mac and iPhone for my primary devices, Office365 for business cloud services, Linux for my embedded projects. I may be early to the party, but I have a feeling that’s the way the wind is blowing for a lot of people.

BuildRoot: Fixing build failures due to missing hosts files.

BuildRoot is an amazing accomplishment and an absolute must for Embedded Linux, but like most projects in the Linux ecosystem, it breaks easily. My particular problem occurred any time I tried to “make” more than once after I had selected a couple of packages; suddenly, the “output/target/etc/hosts” file would be missing, resulting in an error similar to the following:

/usr/bin/sed -i -e '$a \127.0.1.1\tbuildroot' \
        -e '/^127.0.1.1/d' \
        /home/buildroot/buildroot-2012.08/output/target/etc/hosts
/usr/bin/sed: can't read /home/buildroot/buildroot-2012.08/output/target/etc/hosts: No such file or directory

Restarting from scratch (make clean, make clean all, make distclean) wouldn’t solve the problem, nor did deleting the entire output directory.

The problem seems to occur because some packages will try to write to the “hosts” file on the target before that file has been created by BuildRoot. My solution was to help BuildRoot out by adding an /etc/hosts file to the skeleton file system; that way, BuildRoot copies the file into the target file system at the beginning of the make process. After one more make clean, the make command succeeded.
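Roughly, from the top of the BuildRoot tree, the fix looked like this (on my 2012.08 tree the default skeleton lived under system/skeleton; adjust the path for your BuildRoot version):

$ mkdir -p system/skeleton/etc
$ nano system/skeleton/etc/hosts    # add the two lines shown below
$ make clean
$ make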

The hosts file had the following contents.

127.0.0.1 localhost
127.0.1.1 buildroot