Although there are many things that a Hacker does well, design is not one of them.
The key question is, why not?
Partly it's because design doesn't lend itself to the Hacker style of working. For one, design is explicitly an act of creation. The Hacker, on the other hand, enjoys probing an existing system for faults. The goal of design is to create systems that produce the desired output without those faults. Since the job is creating the system, there's no system to probe (yet), so the Hacker's favorite activity has nothing on which to operate.
For another, design often involves choosing what to avoid. Put another way, sometimes good design is about choosing what not to do. Is it worth optimizing a given process, or is it better to go with the naive implementation? The Hacker's instincts run so strongly counter to choosing not to do something that he will often not recognize that such a choice is at hand (let alone make that choice). The consequences of his "hack" live on in infamy until it's undone.
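To make the "optimize or stay naive?" choice concrete, here's a hypothetical sketch (the scenario and function names are invented, not taken from any particular system). The naive version is slower per lookup but simpler; the optimized version pays for its speed with extra memory and a stale-index headache whenever the data changes:

```python
# Naive: a linear scan. Perfectly fine for the handful of lookups
# most code paths actually perform.
def find_user_naive(users, name):
    for u in users:
        if u["name"] == name:
            return u
    return None

# Optimized: build an index first. Worth it only when lookups are
# frequent, and it must be rebuilt whenever the user list changes.
def build_index(users):
    return {u["name"]: u for u in users}

users = [{"name": "ada"}, {"name": "lin"}]
print(find_user_naive(users, "lin"))
print(build_index(users)["ada"])
```

Choosing the naive version here is exactly the kind of "choosing not to do something" the Sculptor is comfortable with and the Hacker tends not to notice as a choice at all.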
During the design process you inevitably have to make choices. Do you optimize for space or speed? Do you design for caller simplicity or callee simplicity?
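The caller-simplicity versus callee-simplicity tradeoff can be sketched in a few lines. This is a hypothetical illustration (the functions and scenario are invented): one design absorbs complexity inside the callee so every caller stays simple; the other keeps the callee strict and pushes the work onto callers.

```python
# Option A: caller simplicity. The callee accepts sloppy input
# (numbers or numeric strings) and normalizes it internally.
def total_a(prices):
    return sum(float(p) for p in prices)

# Option B: callee simplicity. The callee demands clean input;
# validation and conversion become every caller's job.
def total_b(prices: list[float]) -> float:
    return sum(prices)

print(total_a(["1.50", 2, 3.25]))  # callers can be sloppy
print(total_b([1.5, 2.0, 3.25]))   # callers must normalize first
```

Neither option is simply "right"; with one callee and many callers, Option A often wins, while Option B keeps the callee easier to reason about and test.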
When the Hacker encounters such a decision, his instincts work against making the design tradeoff in exchange for a more coherent system. The Hacker specializes in discovering system faults and working around them, no matter how much the workaround is at odds with the manifest intent in the design of the system. The choice, to the Hacker, appears to be a problem in need of fixing, and he's got tried-and-true methods for fixing problems.
The Sculptor is less likely to see the choice as a problem requiring a workaround. To him it's a chance to make the right tradeoff for the particular problem the system is intended to solve. To the Sculptor it goes without saying that all systems have limitations. He's more interested in having the system do well what it was intended to do than in having it do everything that could be done.
For the Sculptor, the discovery of a system fault is somewhat less pleasant than it is for the Hacker. The fault might mean that somewhere along the line he made a bad design choice, and many design choices are very hard (or impossible) to undo once a system is in use. The discovery of a bad design choice is also unpleasant to the Sculptor because he enjoys making good ones.
Sometimes it isn't obvious which tradeoff is the correct one. This is often a clue that the Sculptor doesn't understand some aspect of the problem domain. Ideally he'll be able to get a better understanding, whether by talking with the client or by consulting existing documentation, Wikipedia, or reference books. Sometimes this isn't possible - the client may not be available, there may be no existing documentation, nothing else to go on. The Sculptor has to make a call. However, because the Sculptor has gotten into the habit of getting to know the underlying problem domain, there's a good chance he's run across something in it that points to the right decision.
Sometimes the Sculptor has a pretty good idea of the relative costs of each option but doesn't know how much the difference matters to the client. An option that's many times more expensive can often be discarded once it turns out the client is more than happy with the cheaper result. The longer the Sculptor has been sculpting, the better he gets at recognizing these "decision moments".
For all these reasons, the Hacker's instincts work against good design. Where good design is found there isn't as much need for hacks. Changes to existing systems become less about probing, trial and error, or making the system work in ways it wasn't intended, and more about finding the right pieces in the existing system to assemble or extend just-so to add the needed functionality.
As a code base increases in size, good design can mean the difference between shipping a new, nearly bug-free feature in 6 weeks instead of 6 months. It can mean the difference between a new employee becoming productive in a few weeks instead of a few months. Such systems tend to survive and thrive long after the designer is no longer working on them.
Systems designed by Hackers, on the other hand, are plagued by bugs. Some easily repeatable, some seemingly non-deterministic. They're fixed quickly but seem to crop up again a few months later. The Hacker isn't (usually) intentionally creating a buggy system so that he has something to do; it's just that he isn't as good at design. He doesn't enjoy it nearly as much as he enjoys hacking and so puts commensurately less effort into design.
Systems designed by Hackers tend to accumulate hacks at several levels. At the level of the user interface there may be major discrepancies, both internally and with respect to other, similar products. This is the manifestation of the Hacker's instinct to hack the system. He may have come across a user interface need that wasn't directly met by the available user interface widgets, so he baked something entirely from scratch. Ironically, the desire to hack the system often acts as a disincentive to taking advantage of a system's built-in functionality; why bother to figure out what's available when all you need is enough to start and you'll hack out the rest?
At the level of coding constructs, systems designed by Hackers will often have a lot of copy/paste re-use. Somewhat more experienced Hackers may make use of libraries, but their code will often have strange quirks. Maybe it throws lots of exceptions during execution. There will be lots of exception handling, even in places where it doesn't make sense. The Hacker may think of this as "defensive programming", but in reality it's another manifestation of the Hacker instinct: try a bunch of things and hope something works. Oddly enough, the Hacker is often proud that he doesn't really understand exactly why his fix works, as long as it appears to work (most of the time).
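The kind of exception handling described above can be sketched in a few lines. This is a hypothetical illustration (the function names and scenario are invented): the first version swallows every possible error and substitutes a guess, while the second catches only the failures the call can actually raise and makes the fallback explicit.

```python
# Hacker style: catch everything, return a guess, move on.
def parse_port_hacker(value):
    try:
        return int(value)
    except Exception:        # hides typos, Nones, and real bugs alike
        return 8080          # silently "works"... most of the time

# Narrower style: catch only the expected failures, and say so.
def parse_port(value, default=8080):
    try:
        return int(value)
    except (TypeError, ValueError):  # the errors int() raises on bad input
        return default

print(parse_port_hacker("80"))
print(parse_port("not-a-port"))
```

Both return the same answers on good input; the difference shows up months later, when the blanket `except Exception` quietly masks a real bug that the narrower version would have surfaced immediately.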