“Michi’s Minions” – On Respecting Co-workers

Relatively early in my career, I started hearing the phrase “Michi’s minions.” People use it to jokingly refer to my team. They say it in private, so some might conclude it’s just a crass joke. Perhaps. For people who know my crude sense of humor, my taking offense at this joke probably comes as a surprise. But every time I hear that phrase, I immediately conclude that the other person is not somebody I want to work for.

To me, it indicates a condescending attitude toward employees. I once heard the analogy that management is like a rowing team: you’re the coxswain who keeps the boat on course. Yet, when you think about it, you never touch an oar. If somebody gets tired or wants to quit, you can’t take out a whip and start cracking it. At the same time, without the coxswain, the team will never make it to the finish line. You need each other. All of my greatest accomplishments as a leader in an organization came from the hard work the team put in. To forget that your staff were the ones furiously rowing is ignorant, if not insulting.

When somebody thinks that “managing” equates to “having minions,” it’s not pretty to watch them get a little power. I’ve seen this a few times now, and it has had disturbing results every time. The usual trend is:

  1. They give their staff all the boring, dirty work
  2. They scold in public and praise in private (if at all)
  3. They say, “I don’t need to be liked as long as work is getting done”
  4. People start quitting

I want to address #3 really quickly. “Being liked” and “getting things done” are not mutually exclusive. A good leader achieves both, every time. If you can’t create a work environment where people are happy, you aren’t qualified to be a leader. Think about the last job where you constantly went above and beyond. Did you like your boss? I bet you did. That’s how much I believe in having a good relationship with the people you work with.

This post isn’t about watching your language. It’s about watching your attitude.

You need your team more than they need you.

Social Payments: the Future is Unified

Physical credit cards will soon be a thing of the past. Is the rest of the US startup industry ready?

The next real-world cash replacement could be powered by Facebook, Google, Apple, Square, Intuit, Paypal, or some other company waiting in the wings. Some of those names are obvious; others will strike some people as coming out of left field. This post isn’t about how those left-field plays could happen. I simply want to explain how the landscape is changing.

There’s a convergence happening right now between social, payments, and e-commerce. Imagine this predictable future:

You buy some coffee at Starbucks. You take out your phone and swipe it at the terminal. Your [insert phone app name here] Bucks (henceforth: “Phone Bucks”) are deducted from your account. Your purchase is optionally posted to your Facebook/Twitter stream. You get a highly targeted, Groupon-style notice for a Starbucks coupon redeemable online immediately. You decide to buy it using your Phone Bucks — no signing in, no additional authorizations — by clicking a button.

We’re talking about a future where your online wallet (today, known as Paypal, Facebook Credits, etc.) follows you into the real world and ties directly into your mobile phone. This represents a single unified wallet. And it makes sense. That’s the future. That’s where we are headed now. I’ve been watching this trend happen for the past few years, and it’s exciting to finally see some big players waking up to this reality. Which players are the closest to achieving this? In this order:

1. Facebook – Due to its large install base (virtually all smartphones) and an existing currency platform (Credits), they are best positioned to move into the real world. And they recently made a huge move indicating a desire to do exactly this (creating a subsidiary is the first step in buffering liabilities that come with real-world payments).
2. Square (or Intuit depending on how things play out) – They would solve this from the other direction: they have a stronger real-world presence, and moving into the digital space might be easier than vice-versa.
3. Google – They will approach this from the platform (Android) by opening it (Google Checkout 2.0) up to developers and creating an ecosystem. They also recently stole a key exec from Paypal, so you know they’re serious.

It’s my belief that any startup entering the e-commerce landscape right now needs to make sure they are thinking about this convergence. To get big valuations, I think a startup needs to not only understand these trends but be the first to market in the new paradigm that will be coming (really soon!). This convergence will create an opportunity for new players to emerge and destroy existing leaders. All mobile startups around commerce, Groupon, Paypal, and even the advertising arm of Google are probably already adjusting to these trends. Is your startup?

Think about it.

PHP Tip: Always Put Constants on the Left in Boolean Comparisons

This was a standard I enforced at my last company:

Whenever you are doing a boolean check (such as an IF or WHILE), always place constants on the left side of the comparison.

The following is BAD:

// BAD
if($user == LOGGED_IN) {

The following is GOOD:

// GOOD
if(LOGGED_IN == $user) {

Why is this such a big deal? Imagine the typo where you forget the second equals sign:

// Oops! This is an assignment, not a comparison. It passes whenever LOGGED_IN is truthy!
if($user = LOGGED_IN) {

This sort of bug is fairly common. C# goes as far as requiring the condition of an if statement to be a boolean expression, which catches most accidental assignments at compile time. Since PHP can’t do that, this is the next best thing. Notice how this convention will save your butt:

// Parse error: you can't assign to a constant. Bug caught immediately.
if(LOGGED_IN = $user) {
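
For completeness, here’s a tiny self-contained script showing both failure modes side by side (the LOGGED_IN value and $user here are made up just for the example):

define('LOGGED_IN', 1);
$user = 0; // pretend this user is NOT logged in

// Typo with the variable on the left: silently becomes an assignment,
// and the block runs even though the user is logged out.
if ($user = LOGGED_IN) {
	echo "Oops, treated as logged in\n";
}

// The same typo with the constant on the left refuses to even parse:
// if (LOGGED_IN = $user) { }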

Think about it. 🙂

Incentives and the Related Dangers

Incentives are just as dangerous as they are powerful. I have a running theory that most incentives, when executed poorly, achieve the exact opposite of their intended goal.

Let’s start with an example. You’re in charge of a small company that picks up garbage after events like street fairs and parades. One day, you get an angry call from your customer (the city): your company has been doing an increasingly poor job, and they are threatening to cut your contract.

You can fire and hire people, but ultimately, you or some new manager will need to fix the culture of the team. Aside from the obvious choice of talking to your staff about goals and values, let’s assume that incentivizing performance ends up being the option you go with. It’s time to play with fire.

What are some obvious ways to incentivize good cleaning efforts? There are many, but I’ll focus on a really obvious one for this post: Tie bonuses to volume/weight of trash picked up.

Is this a bad incentive? Not necessarily. But if executed poorly, it can be disastrous.

Which is more important to pick up: the pile of 50 napkins or the four empty soda bottles? Under this incentive, people are encouraged to ignore napkins, cigarettes, and plastic bags and to chase after bottles (bonus points for leftover liquid) and discarded food. In fact, once employees catch on, they might even start picking up rocks and dirt instead of actually cleaning — in effect making the situation worse.

The situation above is universal across industries. In software, the oft-cited “Dilbert” situation is when performance gets tied to lines of code written. The point is, any system that attempts to incentivize a certain type of behavior can cause employees to focus on the wrong thing. If you tell your staff that bonuses are tied to closing bug tickets, your entire team will focus on that metric like a laser. It will look great at first, until you realize that everybody is spam-fixing the “misspelled text” tickets and nobody is touching the REAL problems.

The Destruction of the Head Hunting Industry

This is a random thought that just popped into my head.

With information becoming increasingly available, I’ve been thinking that the headhunting business will go through a major destructive phase in the next few years. There are two things the Internet changed:

  • Better distribution of information on job openings
  • Better distribution of information on candidates

Definition: For those of you who are unaware, head hunters are professionals who find candidates and pair them up with open positions at companies. In a typical scenario, a company pays a recruiter (head hunter) a fee equal to 2-3 months of the new employee’s salary. Companies pay this because recruiting is expensive. I’ve done a lot of hiring in the last few years, and I know how time consuming it is to review hundreds of resumes and then interview. A head hunter is basically an outsourced HR department. Additionally, candidates often approach head hunters, who re-post job openings on various job boards.

And a third trend will follow, driven by the increasing amount of information available to the public:

  • Automation of job and candidate pairing

A long time ago, I was business partners with a man who was formerly a head hunter. I remember him telling me how wonderful the Internet had made his job. When he was my age, recruiting meant shaking a lot of hands, memorizing every face and name you ever met, and storing large piles of business cards. Now, recruiting was about posting jobs on Craigslist and Monster and referring the candidates. In his mind, he was still the gatekeeper. These days, anybody can be a head hunter with a little Internet know-how.

[Chart: head hunter productivity goes up first, then down (we are in the middle stage now)]

However, sites like LinkedIn can change all that. The one true value proposition that head hunters provide is that they serve as matchmakers. But as more information becomes available and technology improves, this process should become more and more automated. For example, LinkedIn already has job postings. On its own, that’s just a new competitor to Craigslist; what makes things interesting is that LinkedIn also has the data points to find every candidate who might fit the job requirements — without anybody lifting a finger.

Right now, the information stream is mono-directional: job postings (and recruiters) broadcast information. The goal is a bi-directional system where seekers fill out their requirements (a.k.a. their resumes) and both sides let the system do the matching. This can only work if both sides have maximum information about the other. Think of it like a dating site for job seekers. It’s a hard problem to solve given the time-sensitive nature of job searches, but it’s an inevitable outcome as more and more information centralizes onto the Internet.

5AM thought of the day.

Google’s Real Goal Behind All Their Free APIs

Ever wonder why Google gives away so many web-developer tools, tools that otherwise look like complete money-and-bandwidth-pissing schemes, most of which don’t even directly show ads?

It is all about collecting browsing behavior in a long-term bid to increase ad efficiency. Nothing else.

  1. It is not about making things more “open”
  2. It is not about making web development easier
  3. It is not about making an online operating system
  4. It is not about competing with Microsoft
  5. It is not about making the Google brand more ubiquitous
  6. It is not about showing ads in new places

If any of the above things happen, they are (likely planned) side effects. For example, if a particular API makes something easier, that’s good because it will encourage more developers to adopt it. But as I will explain shortly, the commonly held beliefs about Google doing Good or making the web more open are simply not the reason for these initiatives.

If you notice, all of their APIs use JavaScript, which means every request can be tied back to the computer it came from. So on top of your search preferences, Google can eventually begin to correlate your browsing habits based on which of the sites you visit use Google APIs.

For example, if my blog were to use a YouTube embed, it would be possible for Google to read a cookie originally placed on your machine by YouTube and correlate it with traffic coming from this site. That means they can uniquely track every YouTube video your computer has watched since the last time you cleared your cookies. YouTube is just an example; most of Google’s APIs are far less obvious to the end user. The unified AJAX libraries, for instance, could be used by a good half of the “2.0” websites out there without hurting performance (in many cases they would make the sites load faster for the end user). But because everything goes through Google, it’s possible (although I’m not saying they are doing it) for them to track which sites you visit.
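
To make the mechanism concrete, here’s a purely illustrative PHP sketch (obviously not Google’s actual code) of what the server behind any embedded third-party resource could log on every request:

// Purely illustrative. Pretend this script serves an embedded resource
// (a JS library, a video player, etc.) from a third-party domain that
// previously set a unique cookie on the visitor.

$visitorId = isset($_COOKIE['uid']) ? $_COOKIE['uid'] : null; // cookie set on an earlier visit
$embeddingSite = isset($_SERVER['HTTP_REFERER']) ? $_SERVER['HTTP_REFERER'] : 'unknown';

if ($visitorId !== null) {
	// One log line per embed request is enough to build a browsing profile over time
	error_log("Visitor $visitorId was just seen on $embeddingSite");
}

// ...then serve the actual resource as usual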

If this isn’t extremely valuable information, I don’t know what is. Don’t forget that the AdSense API is, in itself, a means for Google to track every website you’ve ever been to that uses AdSense, and a way for Google to know exactly which types of ads interested you in the past. Once they know what sites you visit, they can surmise what a given site is about and then determine, for example, what sort of products would interest you.

It’s the classic advertising chicken and egg problem: If I knew what my customers wanted, I could sell it to them, but they won’t tell me.

…And Google found the chicken. For the time being, they haven’t started using this information (at least not noticeably), but I am sure they will as competitive pressure makes it necessary.

Say goodbye to privacy. =( Oh wait, I’ve been saying that for quite some time now.

Debugging Tips for Database Abstraction

Today I want to talk about debugging database queries in large systems. The main problem is that in a large application, it becomes difficult to trace a rogue query (say, one that broke in a recent system update) back to its source. This may not readily apply to most of you, but bear with me: some day it will.

Pretend for a moment that you have a database architecture with 2 masters (dual replication) and 2 read-only slaves. Now pretend that you have a large application with 100 different pages/scripts and 5 web servers running mirror copies of the application. This would be a fairly typical setup for a small but growing company.

One day, you come into work and find out that a bad transaction lock caused your system to hang all weekend. You look at the process list and know which query is causing the problem (because it’s still stuck). The trouble is that it looks suspiciously like the queries you’d find on virtually every page of your application. How do you track it down? A different (but related) problem is when an update executes fine on one master, replicates out, and then gets stuck on a slave while running fine everywhere else. What happened? Which master received the initial query? This sort of debugging is very difficult without more information, such as where the query was originally sent and what page it originated from.

The primary challenge is figuring out which query came from which page in your application. The solution is to add logging straight into your queries. The implementation looks something like this:

//Get the current page or script file
$source = $_SERVER['REQUEST_URI'] ? $_SERVER['REQUEST_URI'] : $_SERVER['SCRIPT_FILENAME'];

//Neutralize any comment tags and append the database host being connected to
$metaData = str_replace(array('/*', '*/'), array('/ *', '* /'), $source) . " ($databaseHost)";

//Escape the query so the URI can't be used to inject data
$metaData = mysql_real_escape_string($metaData);
//Execute the query
$result = mysql_query("/* $metaData */ " . $query, $connection);

This solution inserts a comment into your query that gives you useful information when you look at the raw query. MySQL supports C-style comment blocks (the /* */), which are ignored by the parsing engine. This means you can attach data to a query that is useful for debugging. These comments are also replicated down to the slaves, which helps when you find a slave choking on a query that came from a master server. For those of you unaware, the “URI” here refers to the path and query string that was requested, essentially everything after the domain name in the address bar.

But make sure that you correctly sanitize the URI so that somebody can’t arbitrarily end your comment block (with a */) and inject their own nonsense into your query. Also, considering issues like multi-byte character attacks, I don’t even want to take the risk of not further escaping the data with a call to mysql_real_escape_string.

The solution we use at my work logs the web server IP, database server IP, and script path/URI. Other potential ideas are local timestamps, version information, user IDs, and session IDs.
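
As a minimal sketch of how you might bundle all of this together (run_logged_query() is a made-up helper name, and I’m assuming $connection and $databaseHost already exist in your database layer), it could look like this:

//A minimal sketch of a query wrapper that embeds debugging metadata
function run_logged_query($query, $connection, $databaseHost, $userId = null)
{
	//Identify the page or script that issued the query
	$source = !empty($_SERVER['REQUEST_URI']) ? $_SERVER['REQUEST_URI'] : $_SERVER['SCRIPT_FILENAME'];

	//Add extra context: target database host, local timestamp, and (optionally) the current user
	$metaData = $source . " ($databaseHost) " . date('c');
	if ($userId !== null) {
		$metaData .= " user:$userId";
	}

	//Neutralize comment delimiters and escape so nothing can break out of the comment
	$metaData = str_replace(array('/*', '*/'), array('/ *', '* /'), $metaData);
	$metaData = mysql_real_escape_string($metaData, $connection);

	//Prepend the metadata comment and run the query
	return mysql_query("/* $metaData */ " . $query, $connection);
}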

In conclusion, this technique helps you identify the source (and sometimes the destination) of queries that are causing problems. We use it often in our production environment to figure out which pages are producing extremely slow queries. It should work with any database, although my example is written for MySQL.

Happy debugging!

BUG: Constructors, Interfaces, and Abstracts Don’t Mix Well

I just discovered a bug today in PHP 5.1 (I haven’t confirmed whether it is fixed in newer versions). When you try to enforce interface arguments on constructors, PHP behaves unexpectedly. Normally, interfaces let you enforce argument counts and types on child class methods, but not on the constructor (and probably not the destructor).

Crash course on interfaces: An interface lets you, as a developer, dictate a standard for a class. For example, you might write an interface for interacting with your class. Other people who want to interact with your class would then “implement” your interface, which forces their classes to have a certain set of methods whose names and argument counts (and types) you dictate. This way, your class is always guaranteed that these implementing classes have certain key methods. As a real-life example, an interface for a Car would have methods like brake($amount), gas($amount), and steer($direction), so the User class has a guaranteed way of interacting with any Car object (i.e., $user->getCar(‘Ferrari’)->steer(‘left’)). Abstract methods in abstract classes are essentially the same thing. Read more about these here and here.

First, here is an example of a typical interface:

class ExampleClass {}

interface TestInterface {
	public function output(ExampleClass $var);
}

class Test implements TestInterface {
	// error, no output() method was defined
}

The following fails too:

class ExampleClass {}

interface TestInterface {
	public function output(ExampleClass $var);
}

class Test implements TestInterface {
	public function output($var) {} // error, missing the ExampleClass type hint
}

Here is the same example but with the __construct method instead:

class ExampleClass {}

interface TestInterface {
	public function __construct(ExampleClass $var);
}

class Test implements TestInterface {
	// error, no __construct() method was defined
}

Up to here, it works as expected. However, if you define the constructor, the __construct method argument datatype/count checks go out the window:

class ExampleClass {}

interface TestInterface {
	public function __construct(ExampleClass $var);
}

class Test implements TestInterface {
	public function __construct() {} // NO ERROR
}

Despite the argument count and data types being off, PHP doesn’t care. Even if I define an argument in the constructor, the data-type check is ignored. So the best you can do is force a __construct() definition to exist, but you can’t dictate its arguments (i.e., interfaces for constructor methods are useless). And finally, for those of you who are really astute:

class ExampleClass {}

abstract class AbstractTest {
	abstract public function __construct(ExampleClass $var);
}

class Test extends AbstractTest {
	public function __construct() {} // NO ERROR
}

This problem produces the SAME results if abstract methods in an abstract parent class are used instead of an interface.
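
If you really need the type guarantee, one workaround (just a sketch of my own, and it costs you two-step construction) is to move the typed dependency out of the constructor and into a regular method like a made-up init(), since, as the first example showed, interfaces do enforce signatures on normal methods:

class ExampleClass {}

interface TestInterface {
	// a regular method's signature IS enforced, unlike __construct()
	public function init(ExampleClass $var);
}

class Test implements TestInterface {
	public function __construct() {}

	// omitting the ExampleClass type hint here would be a fatal error,
	// so the dependency's type is still guaranteed
	public function init(ExampleClass $var) {}
}

$test = new Test();
$test->init(new ExampleClass()); // two-step construction

The obvious downside is that callers have to remember to call init() right after new, so it’s a trade-off rather than a true fix.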

Unrealistic Expectations in Job Posting

A funny Craigslist post (now deleted):

Web Designer/Programmer Needed
Reply to: job-483872814@craigslist.org
Date: 2007-11-19, 4:51PM EST
Must have specific database and Web-development experience to include in-depth database management and Web design services. Experience with government clients providing extensive data management and document tracking support a plus. Must have strong computer programming skills across a wide range of platforms/software programs to include:

• PHP5
• ColdFusion 5, ColdFusion MX
• Classic ASP, ASP.NET
• Visual Basic, VB.NET
• HTML, DHTML, XML
• CSS
• Adobe Flex
• JavaScript and VBScript
• Zend Studio
• ColdFusion Studio
• Visual Studio .NET
• Dreamweaver MX
• Front Page
• Acrobat PDF
• Content Management
• Photoshop / Image Ready
• Quark Xpress
• Flash MX
• Fireworks MX
• SQL / PL/SQL
• SQL Server 2005
• Access
• Oracle
• Paradox
• Informatica Data Analyzing and Procedural Mapping
• PC (Windows XP)
• MAC (OS X Tiger)
• UNIX
• Novell Netware5
• Internet Information Server 5
• SQL Server
• MS Windows Server 2003
• ColdFusion Server & Administrator
• Microsoft Office User Specialist
• Corel OfficeSuite

Please email resume and links to websites you have created or on which you have collaborated, to be considered.

* Location: Williamsburg Area
* Compensation: Negotiable
* Telecommuting is ok.
* This is a part-time job.
* This is a contract job.
* OK to highlight this job opening for persons with disabilities
* Principals only. Recruiters, please don’t contact this job poster.
* Please, no phone calls about this job!
* Please do not contact job poster about other services, products or commercial interests.

That’s like pretty much every single web technology plus random experience with Corel.