
Thursday, February 5, 2009

Clear And Effective Communication In Web Design

Communication is one of the foundational elements of a good website. It is essential for a positive user experience and for a successful website that truly benefits its owners. All types of websites are affected by the need for good communication in one way or another. Regardless of whether the website in question is an e-commerce website, a blog, a portfolio website, an information website for a service company, a government website or any other type of website, there is a significant need to communicate effectively with visitors.



Because of the significance of communication with visitors, it is an essential consideration for every designer and website owner and the responsibility of both. Unfortunately, communication is sometimes overlooked and takes a backseat to the visual attractiveness of a website. Ideally, the design and other elements that do the communicating work together to create a clear, unified message to visitors.



Umbrella Today?


In this article, we'll take a broad look at the subject of clear communication in Web design. We'll start with a discussion of the primary methods of communication for websites and typical challenges that designers face. From there, we'll move on to look at what specifically should be communicated to visitors and tips for implementing this in your own work. At the end, we'll look at some of the goals that should be established in terms of communication when developing websites, as well as some of the results of having a website that communicates effectively.




Read More...

[Source: Smashing Magazine - Posted by Kishore Vengala]

Ask SM [CSS/JS]: Pixel Width Decisions, Modal Boxes

This is our second installment of Ask SM, featuring reader questions about Web design, focusing on HTML, CSS and JavaScript. If you have a question about one of these topics, feel free to reach me (Chris Coyier) through one of these methods: send an email to ask [at] smashingmagazine [dot] com with your question, post your question in our forum (you will need to sign up and yes, the forum is not officially launched yet, but it is running!) or, if you have a quick question, just tweet us @smashingmag or @chriscoyier with the tag "[Ask SM]."


Screenshot


Please note: I will do what I can to answer questions, but I certainly won't be able to answer them all. However, posting questions to the forum gives you the best opportunity to get help from the community.




Read More...

[Source: Smashing Magazine - Posted by Kishore Vengala]

50 Beautiful And User-Friendly Navigation Menus

Usability is an essential goal of any website, and usable navigation is something every website needs. It determines where users are led and how they interact with the website. Without usable navigation, content becomes all but useless. Menus need to be simple enough for the user to understand, but also contain the elements necessary to guide the user through the website — with some creativity and good design thrown in.



Screenshot



Below we present over 50 excellent navigation menus — we feature CSS-based design solutions, CSS+JavaScript-based menus and Flash-based designs. However, they all have something in common: they are user-friendly yet creative, and they fit the style of their respective websites perfectly.


Please also consider our previous articles:




Read More...

[Source: Smashing Magazine - Posted by Kishore Vengala]

Mastering WordPress Shortcodes

Introduced in WordPress 2.5, shortcodes are powerful but still relatively unknown WordPress functions. Imagine you could just type "adsense" to display an AdSense ad or "post_count" to instantly find out the number of posts on your blog. WordPress shortcodes can do this and more and will definitely make your blogging life easier. In this article, we'll show you how to create and use shortcodes, as well as provide killer ready-to-use WordPress shortcodes that will enhance your blogging experience.


WordPress Shortcodes



Using shortcodes is very easy. To use one, create a new post (or edit an existing one), switch the editor to HTML mode and type a shortcode in brackets, such as [showcase]. It is also possible to use attributes with shortcodes. A shortcode with attributes would look something like [showcase id="5"]. Shortcodes can also embed content, as shown here: [url href="http://www.smashingmagazine.com"]Smashing Magazine[/url]. Shortcodes are handled by a set of functions introduced in WordPress 2.5 called the Shortcode API. When a post is saved, its content is parsed, and the shortcode API automatically transforms the shortcodes to perform the function they're intended to perform.
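To make that concrete, here is a rough sketch of what a handler for the [showcase id="5"] example above might look like, registered from a plugin or a theme's functions.php. The wrapper markup it returns and the default attribute value are invented purely for illustration; only add_shortcode(), shortcode_atts() and do_shortcode() are actual Shortcode API calls.

<?php
// Hypothetical handler for the [showcase id="5"] example above.
function my_showcase_shortcode( $atts, $content = null ) {
    // Merge the attributes supplied in the post with our defaults.
    $atts = shortcode_atts( array( 'id' => '0' ), $atts );
    // Process any shortcodes nested inside enclosed content, e.g. [showcase]...[/showcase].
    $inner = is_null( $content ) ? '' : do_shortcode( $content );
    // The wrapper markup here is made up for the sketch.
    return '<div class="showcase" id="showcase-' . intval( $atts['id'] ) . '">' . $inner . '</div>';
}
add_shortcode( 'showcase', 'my_showcase_shortcode' );
?>

With that in place, typing [showcase id="5"]Some content[/showcase] in the HTML editor is replaced by the wrapper div, with the processed content inside it, when the post is rendered.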



Read More...

[Source: Smashing Magazine - Posted by Kishore Vengala]

Ask SM [PHP]: Form Validation, Converting MySQL to XML

PHP and other server-side programming languages are tricky. The manual can be tough to decipher, and there isn’t really a way to “validate” a PHP script. If you’re new to programming, you may feel lost and not know where to look for help. When I first started programming, I spent hours pulling my hair out, digging through manuals, and poring over books. It wasn’t until I found a great online community that I really started to get into the swing of things with PHP and felt like I was actually accomplishing something.



PHP-questions



Here at Smashing Magazine, we want to help out PHP programmers who are just getting started or who want to improve their programming chops. Our goal is to support our community by answering their questions and trying to find solutions to their problems.


While Chris Coyier takes care of CSS- and JavaScript-related questions, from now on I, Jason Lengstorf, will take care of your PHP- and MySQL-related questions. Posts focused on Ruby, Python, Photoshop and Illustrator are coming as well.



Read More...

[Source: Smashing Magazine - Posted by Kishore Vengala]

8 Useful Tips To Become Successful With Twitter

Twitter is the new big thing. With everybody from Britney Spears to Barack Obama now on Twitter, it is safe to say the social networking platform has gone mainstream. For many users worldwide, Twitter has become a crucial tool for maintaining contacts, exchanging opinions and making new connections. But what does this mean for the service, and how can we, as website owners, actually use it for our own purposes?


Twitter


So how do I use Twitter? I guess the first thing to say is that I am not a huge Twitter success story. However, Twitter is turning into the third facet of my online presence, alongside my blog and podcast. With that in mind, let me share a few tips that have helped me better use this interesting new tool.




Read More...

[Source: Smashing Magazine - Posted by Kishore Vengala]

The Sad Tragedy of Micro-Optimization Theater


I'll just come right out and say it: I love strings. As far as I'm concerned, there isn't a problem that I can't solve with a string and perhaps a regular expression or two. But maybe that's just my lack of math skills talking.


In all seriousness, though, the type of programming we do on Stack Overflow is intimately tied to strings. We're constantly building them, merging them, processing them, or dumping them out to an HTTP stream. Sometimes I even give them relaxing massages. Now, if you've worked with strings at all, you know that this is code you desperately want to avoid writing:



static string Shlemiel()
{
    string result = "";
    for (int i = 0; i < 314159; i++)
    {
        result += getStringData(i);
    }
    return result;
}


In most garbage collected languages, strings are immutable: when you add two strings, the contents of both are copied. As you keep adding to result in this loop, more and more memory is allocated each time. This leads directly to awful quadratic O(n²) performance, or as Joel likes to call it, Shlemiel the painter performance.



Who is Shlemiel? He's the guy in this joke:


Shlemiel gets a job as a street painter, painting the dotted lines down the middle of the road. On the first day he takes a can of paint out to the road and finishes 300 yards of the road. "That's pretty good!" says his boss, "you're a fast worker!" and pays him a kopeck.


The next day Shlemiel only gets 150 yards done. "Well, that's not nearly as good as yesterday, but you're still a fast worker. 150 yards is respectable," and pays him a kopeck.


The next day Shlemiel paints 30 yards of the road. "Only 30!" shouts his boss. "That's unacceptable! On the first day you did ten times that much work! What's going on?"


"I can't help it," says Shlemiel. "Every day I get farther and farther away from the paint can!"



This is a softball question. You all knew that. Every decent programmer knows that string concatenation, while fine in small doses, is deadly poison in loops.
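For completeness, the standard fix for the loop case is to accumulate into a StringBuilder, which appends into a growable internal buffer instead of copying everything built so far on every pass. A minimal sketch, reusing the same hypothetical getStringData() helper from the Shlemiel() example above:

static string NotShlemiel()
{
    // Each Append is amortized constant time; one final copy happens in ToString().
    var sb = new System.Text.StringBuilder();
    for (int i = 0; i < 314159; i++)
    {
        sb.Append(getStringData(i));
    }
    return sb.ToString();
}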


But what if you're doing nothing but small bits of string concatenation, dozens to hundreds of times -- as in most web apps? Then you might develop a nagging doubt, as I did, that lots of little Shlemiels could possibly be as bad as one giant Shlemiel.


Let's say we wanted to build this HTML fragment:



stuff

stuff

stuff
stuff



Which might appear on a given Stack Overflow page anywhere from one to sixty times. And we're serving up hundreds of thousands of these pages per day.


Not so clear-cut, now, is it?


So, which of these methods of forming the above string do you think is fastest over a hundred thousand iterations?


1: Simple Concatenation



string s =
@"
" + st() + st() + @"

" + st() + @"

" + st() + "
" + st() + "
";
return s;


2: String.Format



string s =
@"
{0}{1}

{2}

{3}
{4}
";
return String.Format(s, st(), st(), st(), st(), st());


3: string.Concat



string s =
string.Concat(@"
", st(), st(),
@"
", st(),
@"
", st(), "
",
st(), "
");
return s;


4: String.Replace


string s =
@"
{s1}{s2}

{s3}

{s4}
{s5}
";
s = s.Replace("{s1}", st()).Replace("{s2}", st()).
Replace("{s3}", st()).Replace("{s4}", st()).
Replace("{s5}", st());
return s;


5: StringBuilder



var sb = new StringBuilder(256);
sb.Append(@"
");
sb.Append(st());
sb.Append(st());
sb.Append(@"
");
sb.Append(st());
sb.Append(@"
");
sb.Append(st());
sb.Append("
");
sb.Append(st());
sb.Append("
");
return sb.ToString();


Take your itchy little trigger finger off that compile key and think about this for a minute. Which one of these methods will be faster?


Got an answer? Great!


And.. drumroll please.. the correct answer:

It. Just. Doesn't. Matter!



We already know none of these operations will be performed in a loop, so we can rule out brutally poor performance characteristics of naive string concatenation. All that's left is micro-optimization, and the minute you begin worrying about tiny little optimizations, you've already gone down the wrong path.


Oh, you don't believe me? Sadly, I didn't believe it myself, which is why I got drawn into this in the first place. Here are my results -- for 100,000 iterations, on a dual core 3.5 GHz Core 2 Duo.








1: Simple Concatenation  606 ms
2: String.Format         665 ms
3: string.Concat         587 ms
4: String.Replace        979 ms
5: StringBuilder         588 ms


Even if we went from the worst-performing technique to the best one, we would have saved a lousy 391 milliseconds over a hundred thousand iterations. Not the sort of thing that I'd throw a victory party over. I guess I figured out that using .Replace is best avoided, but even that has some readability benefits that might outweigh the minuscule cost.
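Timings like these are easy enough to reproduce on your own machine. A minimal sketch using System.Diagnostics.Stopwatch, where BuildFragment() is a hypothetical stand-in for whichever of the five techniques you want to measure:

var sw = System.Diagnostics.Stopwatch.StartNew();
for (int i = 0; i < 100000; i++)
{
    BuildFragment(); // stand-in for one of the five string-building methods above
}
sw.Stop();
Console.WriteLine("100,000 iterations: {0} ms", sw.ElapsedMilliseconds);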


Now, you might very well ask which of these techniques has the lowest memory usage, as Rico Mariani did. I didn't get a chance to run these against CLRProfiler to see if there was a clear winner in that regard. It's a valid point, but I doubt the results would change much. In my experience, techniques that abuse memory also tend to take a lot of clock time. Memory allocations are fast on modern PCs, but they're far from free.


Opinions vary on just how many strings you have to concatenate before you should start worrying about performance. The general consensus is around 10. But you'll also read crazy stuff, like this:



Don't use += concatenating ever. Too many changes are taking place behind the scene, which aren't obvious from my code in the first place. I advise you to use String.Concat() explicitly with any overload (2 strings, 3 strings, string array). This will clearly show what your code does without any surprises, while allowing yourself to keep a check on the efficiency.


Never? Ever? Never ever ever? Not even once? Not even if it doesn't matter? Any time you see "don't ever do X", alarm bells should be going off. Like they hopefully are right now.


Yes, you should avoid the obvious beginner mistakes of string concatenation, the stuff every programmer learns their first year on the job. But after that, you should be more worried about the maintainability and readability of your code than its performance. And that is perhaps the most tragic thing about letting yourself get sucked into micro-optimization theater -- it distracts you from your real goal: writing better code.





[advertisement] Did your buddy just get his ear chewed off for another server crash? Help him out by recommending PA Server Monitor. He just might buy you lunch. Download the Free Trial!



Read More...

[Source: Coding Horror - Posted by Kishore Vengala]


Die, You Gravy Sucking Pig Dog!


In the C programming language, you're regularly forced to deal with the painful, dangerous concepts of pointers and explicit memory allocation.



b1 = (double *)malloc(m*sizeof(double));


In modern garbage collected programming languages, life is much simpler; you simply new up whatever object or variable you need.



Double[] b1 = new Double[m];


Use your objects, and just walk away when you're done. The garbage collector will cruise by periodically, and when he sees stuff you're not using any more, he'll clean up behind you and deal with all that nasty pointer and memory allocation stuff on your behalf. It's totally automatic.


Pretty awesome, right? I'd wager the majority of programmers alive today have never once worried about malloc(). I call this progress, as does Jamie Zawinski:



Based on my experience using both kinds of languages, for years at a stretch, I claim that a good garbage collector always beats doing explicit malloc/free in both computational efficiency and programmer time.


However, I also claim that, because of the amount of programmer time that is saved by using GC rather than explicit malloc/free, as well as the dramatic reduction in hard-to-debug storage-management problems, even using a mediocre garbage collector will still result in your ending up with better software faster.


Most of the time, throwing memory and CPU at the problem is still cheaper than throwing programmer time at the problem, even when you multiply the CPUs/memory by the number of users. This isn't true all the time, but it's probably true more often than you think, because Worse is Better.



But even for programmers who have enjoyed automatic garbage collection their whole careers, there are still some.. oddities. See if you can spot one here:



sqlConnection.Close();
sqlConnection.Dispose();
sqlConnection = null;


That is one hellaciously closed database connection. Why don't you take it out back and shoot it, while you're at it?


Even with your friendly neighborhood garbage collector making regular rounds on commodity desktops/servers where many gigabytes of main memory are commonplace, there are still times when you need to release precious resources right now. Not at some unspecified point in the future, whenever the GC gets around to it. Like, say, a database connection. Sure, your database server may be powerful, but it doesn't support an infinitely large number of concurrent connections, either.


The confusing choice between setting an object to null and calling the Dispose method doesn't help matters any. Is it even clear what state the connection is in after Close is called? Could the connection be reused at that point?


Personally, I view explicit disposal as more of an optimization than anything else, but it can be a pretty important optimization on a heavily loaded webserver, or a performance intensive desktop application plowing through gigabytes of data.
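In C#, the idiomatic way to get that deterministic cleanup, without the triple-tap shown above, is a using block, which guarantees Dispose is called even if an exception is thrown. A minimal sketch, assuming System.Data.SqlClient and a placeholder connection string:

using (var sqlConnection = new SqlConnection("...your connection string..."))
{
    sqlConnection.Open();
    // ... run your queries ...
}   // Dispose runs here no matter what, which also closes the connection.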


Of course, your average obsessive-compulsive developer sees that he's dealing with a semi-precious system resource, and immediately takes matters into his own hands, because he can do a better job than the garbage collector. K. Scott Allen proposes a solution that might mollify both camps in Disposal Anxiety:



What the IDisposable interface needs is a method that promotes self-efficacy in a developer. A method name that can stir up primal urges as the developer types. What we need is a method like the one in BSD's shutdown.c module.



die_you_gravy_sucking_pig_dog()
{
char *empty_environ[] = { NULL };

syslog(LOG_NOTICE, "%s by %s: %s",
doreboot ? "reboot" : dohalt ? "halt" : dopower ? "power-down" :
"shutdown", whom, mbuf);
(void)sleep(2);

(void)printf("\r\nSystem shutdown time has arrived\007\007\r\n");
if (killflg) {
(void)printf("\rbut you'll have to do it yourself\r\n");
exit(0);
}


Now, I know this function was written back in the days when steam engines still ruled the world, but we could modernize the function by applying some .NET naming standards.


sqlConnection.DieYouGravySuckingPigDog();


Can you feel the passion behind this statement? This statement carries the emotion that is hard to find in today's code. I hope you'll support this proposal. Good people will be able to sleep at night once again.



So the next time you feel anxious about letting objects fall out of scope, remember: you could always terminate them with extreme prejudice, if you feel it's necessary.


But it probably isn't.






[advertisement] Tired of restoring deleted files? Get PA File Sight and track down the culprit. PA File Sight - file auditing made easy. Download the Free Trial!



Read More...

[Source: Coding Horror - Posted by Kishore Vengala]


You're Doing It Wrong


In The Sad Tragedy of Micro-Optimization Theater we discussed the performance considerations of building a fragment of HTML.



string s =
@"
{0}{1}

{2}

{3}
{4}
";
return String.Format(s, st(), st(), st(), st(), st());


The second act of this particular theater was foreshadowed by Stephen Touset's comment:



The correct answer is that if you're concatenating HTML, you're doing it wrong in the first place. Use an HTML templating language. The people maintaining your code after you will thank you (currently, you risk anything from open mockery to significant property damage).


The performance characteristics of building small string fragments isn't just a red herring -- no, it's far, far worse. The entire question is wrong. This is one of my favorite lessons from The Pragmatic Programmer.



When faced with an impossible problem, identify the real constraints. Ask yourself: "Does it have to be done this way? Does it have to be done at all?"


If our ultimate conclusion was that performance is secondary to readability of code, that's exactly what we should have asked, before doing anything else.


Let's express the same code sample using the standard ASP.NET MVC templating engine. And yes, we render stuff like this all over the place in Stack Overflow. It's the default method of rendering for a reason.



<%= User.ActionTime %>

<%= User.Gravatar %>

<%= User.Details %>
<%= User.Stuff %>



We have an HTML file, through which we poke some holes and insert the data. Simple enough, and conceptually similar to the String.Replace version. Templating works reasonably well in the trivial cases, when you have an object with obvious, basic data types in fields that you spit out.


But beyond those simple cases, it's shocking how hairy HTML templating gets. What if you need to do a bit of formatting or processing to get that data into shape before displaying it? What if you need to make decisions and display things differently depending on the contents of those fields? Your once-simple page templates get progressively more and more complex.



<%foreach (var User in Users) { %>
<%= ActionSpan(User)%>

<% if (User.IsAnonymous) { %>
<%= RenderGravatar(User)%>

<%= RepSpan(User)%>
<%= Flair(User)%>

<% } else { %>
anonymous

<% } %>
<% } %>


This is a fairly mild case, but you can see where templating naturally tends toward a frantic, unreadable mish-mash of code and template -- Web Development as Tag Soup. If your HTML templates can't be kept simple, they're not a heck of a lot better than the procedural string building code they're replacing. And this is not an easy thing to stay on top of, in my experience. The daily grind of struggling to keep the templates from devolving into tag soup starts to feel every bit as grotty as all that nasty string work we were theoretically replacing.


Now it's my turn to ask -- why?


I think existing templating solutions are going about this completely backwards. Rather than poking holes in HTML to insert code, we should simply treat HTML as code.


Like so:



foreach (var User in Users)
{
[ActionSpan(User)]

if (User.IsAnonymous)
{

[UserRepSpan(User)]
[UserFlairSpan(User)]

}
else
{
anonymous

}
}


Seamlessly mixing code and HTML, using a minimum of those headache-inducing escape characters. Is this a programming language for a race of futuristic supermen? No. There are languages that can do this right now, today -- where you can stick HTML in the middle of your code. It's already possible using Visual Basic XML Literals, for example.


Visual Basic XML Literals used in an ASP.NET MVC view


Even the hilariously maligned X# has the right core idea. Templating tends to break down because it forces you to treat code and markup as two different and fundamentally incompatible things. We spend all our time awkwardly switching between markup-land and code-land using escape sequences. They're always fighting each other -- and us.


Seeing HTML and code get equal treatment in my IDE makes me realize one thing:


We've all been doing it wrong.





[advertisement] In charge of a mountain of Windows servers? PA Server Monitor to the rescue! Download the Free Trial!



Read More...

[Source: Coding Horror - Posted by Kishore Vengala]


Dictionary Attacks 101


Several high profile Twitter accounts were recently hijacked:



An 18-year-old hacker with a history of celebrity pranks has admitted to Monday's hijacking of multiple high-profile Twitter accounts, including President-Elect Barack Obama's, and the official feed for Fox News.


The hacker, who goes by the handle GMZ, told Threat Level on Tuesday he gained entry to Twitter's administrative control panel by pointing an automated password-guesser at a popular user's account. The user turned out to be a member of Twitter's support staff, who'd chosen the weak password "happiness."


Cracking the site was easy, because Twitter allowed an unlimited number of rapid-fire log-in attempts.


"I feel it's another case of administrators not putting forth effort toward one of the most obvious and overused security flaws," he wrote in an IM interview. "I'm sure they find it difficult to admit it."



If you're a moderator or administrator it is especially negligent to have such an easily guessed password. But the real issue here is the way Twitter allowed unlimited, as-fast-as-possible login attempts.


Given the average user's password choices -- as documented by Bruce Schneier's analysis of 34,000 actual MySpace passwords captured from a phishing attack in late 2006 -- this is a pretty scary scenario.


MySpace phishing password statistics: character sets


MySpace phishing password statistics: password length


Based on this data, the average MySpace user has an 8 character alphanumeric password. Which isn't great, but doesn't sound too bad. That is, until you find out that 28 percent of those alphanumerics were all lowercase with a single final digit -- and two-thirds of the time that final digit was 1!


Yes, brute force attacks are still for dummies. Even the typically terrible MySpace password -- eight characters, all lowercase, ending in 1 -- would require around 8 billion login attempts:



26 x 26 x 26 x 26 x 26 x 26 x 26 x 1 = 8,031,810,176


At one attempt per second, that would take more than 250 years. Per user!


But a dictionary attack, like the one used in the Twitter hack? Well, that's another story. The entire Oxford English Dictionary contains around 171,000 words. As you might imagine, the average person only uses a tiny fraction of those words, by some estimates somewhere between 10 and 40 thousand. At one attempt per second, we could try every word in the Oxford English Dictionary in slightly less than two days.
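The back-of-the-envelope math is easy to check. A quick sketch of both scenarios at one guess per second (a code fragment, not a full program):

// Brute force: 7 lowercase letters followed by the fixed digit "1" = 26^7 combinations.
double bruteForceGuesses = Math.Pow(26, 7);                          // 8,031,810,176
double bruteForceYears = bruteForceGuesses / (60.0 * 60 * 24 * 365); // roughly 255 years

// Dictionary attack: every word in the OED.
double oedWords = 171000;
double oedDays = oedWords / (60.0 * 60 * 24);                        // just under 2 days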


Clearly, the last thing you want to do is give attackers carte blanche to run unlimited login attempts. All it takes is one user with a weak password to provide attackers a toehold in your system. In Twitter's case, the attackers really hit the jackpot: the user with the weakest password happened to be a member of the Twitter administrative staff.


Limiting the number of login attempts per user is security 101. If you don't do this, you're practically setting out a welcome mat for anyone to launch a dictionary attack on your site, an attack that gets statistically more effective every day the more users you attract. In some systems, your account can get locked out if you try and fail to log in a certain number of times in a row. This can lead to denial of service attacks, however, and is generally discouraged. It's more typical for each failed login attempt to take longer and longer, like so:








1st failed login: no delay
2nd failed login: 2 sec delay
3rd failed login: 4 sec delay
4th failed login: 8 sec delay
5th failed login: 16 sec delay


And so on. Alternately, you could display a CAPTCHA after the fourth attempt.


There are endless variations of this technique, but the net effect is the same: attackers can only try a handful of passwords each day. A brute force attack is out of the question, and a broad dictionary attack becomes impractical, at least in any kind of human time.
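A sketch of the doubling delay from the table above might look like the following, with the obvious caveats that in a real system the failure count would live in your user store rather than an in-memory dictionary, the check would need to be thread-safe, and the delay would be enforced per account or IP before the next attempt is processed:

// Assumes using System; using System.Collections.Generic; using System.Threading;
static readonly Dictionary<string, int> FailedAttempts = new Dictionary<string, int>();

static void OnFailedLogin(string userName)
{
    int failures;
    FailedAttempts.TryGetValue(userName, out failures);
    FailedAttempts[userName] = ++failures;

    if (failures > 1)
    {
        // 2nd failure -> 2 s, 3rd -> 4 s, 4th -> 8 s, 5th -> 16 s, and so on.
        int delaySeconds = (int)Math.Pow(2, failures - 1);
        Thread.Sleep(TimeSpan.FromSeconds(delaySeconds));
    }
}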


It's tempting to blame Twitter here, but honestly, I'm not sure they're alone. I forget my passwords a lot. I've made at least five or six attempts to guess my password on multiple websites and I can't recall ever experiencing any sort of calculated delay or account lockouts. I'm reasonably sure the big commercial sites have this mostly figured out. But since every rinky-dink website on the planet demands that I create unique credentials especially for them, any of them could be vulnerable. You better hope they're all smart enough to throttle failed logins -- and that you're careful to use unique credentials on every single website you visit.


Maybe this was less of a problem in the bad old days of modems, as there were severe physical limits on how fast data could be transmitted to a website, and how quickly that website could respond. But today, we have the one-two punch of naive websites running on blazing fast hardware, and users with speedy broadband connections. Under these conditions, I could see attackers regularly achieving up to two password attempts per second.


If you thought of dictionary attacks as mostly a desktop phenomenon, perhaps it's time to revisit that assumption. As Twitter illustrates, the web now offers ripe conditions for dictionary attacks. I urge you to test your website, or any websites you use -- and make sure they all have some form of failed login throttling in place.





[advertisement] Tired of restoring deleted files? Get PA File Sight and track down the culprit. PA File Sight - file auditing made easy. Download the Free Trial!



Read More...

[Source: Coding Horror - Posted by Kishore Vengala]


Overnight Success: It Takes Years


Paul Buchheit, the original lead developer of GMail, notes that the success of GMail was a long time in coming:



We started working on Gmail in August 2001. For a long time, almost everyone disliked it. Some people used it anyway because of the search, but they had endless complaints. Quite a few people thought that we should kill the project, or perhaps "reboot" it as an enterprise product with native client software, not this crazy Javascript stuff. Even when we got to the point of launching it on April 1, 2004 -- two and a half years after starting work on it -- many people inside of Google were predicting doom. The product was too weird, and nobody wants to change email services. I was told that we would never get a million users.


Once we launched, the response was surprisingly positive, except from the people who hated it for a variety of reasons. Nevertheless, it was frequently described as "niche", and "not used by real people outside of silicon valley".


Now, almost 7 1/2 years after we started working on Gmail, I see [an article describing how Gmail grew 40% last year, compared to 2% for Yahoo and -7% for Hotmail].



Paul has since left Google and now works at his own startup, FriendFeed. Many industry insiders have not been kind to FriendFeed. Stowe Boyd even went so far as to call FriendFeed a failure. Paul takes this criticism in stride:



Creating an important new product generally takes time. FriendFeed needs to continue changing and improving, just as Gmail did six years ago. FriendFeed shows a lot of promise, but it's still a "work in progress".


My expectation is that big success takes years, and there aren't many counter-examples (other than YouTube, and they didn't actually get to the point of making piles of money just yet). Facebook grew very fast, but it's almost 5 years old at this point. Larry and Sergey started working on Google in 1996 -- when I started there in 1999, few people had heard of it yet.


This notion of overnight success is very misleading, and rather harmful. If you're starting something new, expect a long journey. That's no excuse to move slow though. To the contrary, you must move very fast, otherwise you will never arrive, because it's a long journey! This is also why it's important to be frugal -- you don't want to starve to death halfway up the mountain.



Stowe Boyd illustrated his point about FriendFeed with a graph comparing Twitter and FriendFeed traffic. Allow me to update Mr. Boyd's graph with another data point of my own.


twitter vs. friendfeed vs. stackoverflow web traffic


I find Paul's attitude refreshing, because I take the same attitude toward our startup, Stack Overflow. I have zero expectation or even desire for overnight success. What I am planning is several years of grinding through constant, steady improvement.


This business plan isn't much different from my career development plan: success takes years. And when I say years, I really mean it! Not as some cliched regurgitation of "work smarter, not harder." I'm talking actual calendar years. You know, of the 12 months, 365 days variety. You will literally have to spend multiple years of your life grinding away at this stuff, waking up every day and doing it over and over, practicing and gathering feedback each day to continually get better. It might be unpleasant at times and even downright un-fun occasionally, but it's necessary.


This is hardly unique or interesting advice. Peter Norvig's classic Teach Yourself Programming in Ten Years already covered this topic far better than I.



Researchers have shown it takes about ten years to develop expertise in any of a wide variety of areas, including chess playing, music composition, telegraph operation, painting, piano playing, swimming, tennis, and research in neuropsychology and topology. The key is deliberative practice: not just doing it again and again, but challenging yourself with a task that is just beyond your current ability, trying it, analyzing your performance while and after doing it, and correcting any mistakes. Then repeat. And repeat again.


There appear to be no real shortcuts: even Mozart, who was a musical prodigy at age 4, took 13 more years before he began to produce world-class music. The Beatles seemed to burst onto the scene with a string of #1 hits and an appearance on the Ed Sullivan show in 1964. But they had been playing small clubs in Liverpool and Hamburg since 1957, and while they had mass appeal early on, their first great critical success, Sgt. Peppers, was released in 1967.



Honestly, I look forward to waking up someday two or three years from now and doing the exact same thing I did today: working on the Stack Overflow code, eking out yet another tiny improvement or useful feature. Obviously we want to succeed. But on some level, success is irrelevant, because the process is inherently satisfying. Waking up every day and doing something you love -- even better, surrounded by a community who loves it too -- is its own reward. Despite being a metric ton of work.


The blog is no different. I often give aspiring bloggers this key piece of advice: if you're starting a blog, don't expect anyone to read it for six months. If you do, I can guarantee you will be sorely disappointed. However, if you can stick to a posting schedule and produce one or two quality posts every week for an entire calendar year... then, and only then, can you expect to see a trickle of readership. I started this blog in 2004, and it took a solid three years of writing 3 to 5 times per week before it achieved anything resembling popularity within the software development community.


I fully expect to be writing on this blog, in one form or another, for the rest of my life. It is a part of who I am. And with that bit of drama out of the way, I have no illusions: ultimately, I'm just the guy on the internet who writes that blog.




That's perfectly fine by me. I never said I was clever.


Whether you ultimately achieve readers, or pageviews, or whatever high score table it is we're measuring this week, try to remember it's worth doing because, well -- it's worth doing.


And if you keep doing it long enough, who knows? You might very well wake up one day and find out you're an overnight success.





[advertisement] Who filled the file server with MP3 files again? PA Storage Monitor can tell you. Disk and directory growth reports too. Download the Free Trial!



Read More...

[Source: Coding Horror - Posted by Kishore Vengala]


Windows micro Linkfest

One more post to clean out the hopper.

Cheers!

--Claus V.



Read More...

[Source: Grand Stream Dreams - Posted by Kishore Vengala]

Windows 7 News Roundup #5

MSDump

CC Photo Credit: by Choctopus on Flickr

We are getting ready to see INKHEART at the movies after having read all three of the books as a family.  Can't wait!

Until then, here are a truckload of Windows 7 links you might find interesting.

Presented in no particular order.

I'm enjoying my personal explorations of the W7 Beta.  So far it is quite stable and seems to accept most Vista/XP-compatible applications with few complaints.

Some utilities don't play well, particularly ones that deal with networking, but overall it is a nice build and hopefully will overcome most of the issues Vista had during its public release.

Besides, Vista already did the hard work of getting folks to upgrade their hardware, RAM, and system CPUs.

Windows 7 looks to be gravy.

--Claus V.



Read More...

[Source: Grand Stream Dreams - Posted by Kishore Vengala]

A Microsoft Energy-Saver quick-wash Linkpost

The Valca family is recovering today.

Lavie has bloomed again after a three-week battle with a nagging flu.  Alvis is recovering from homework and adjusting to having a TV in her own bedroom.

And me?

I'm trying to catch up on blog posting, several hours of DVR recordings, and the regular Sunday laundry offerings.

It's cloudy outside but warm and cozy inside.

Wash, Rinse, Recycle

  • Process Explorer v11.32 - "This update fixes a bug in the process security page's name resolution and uses history graph tooltips that track the mouse."

  • Autoruns v9.38 - "This fixes a bug that prevented v9.37 from viewing the system account's profile on 32-bit Windows."

  • ZoomIt v3.0 - "This major update to ZoomIt, the Sysinternals screen magnification and annotation utility, adds a LiveZoom mode on Windows Vista and higher, allows you to change the typing and break timer font, adds the ability to copy the magnified screen to the clipboard with Ctrl+C, and introduces a new configuration interface."

  • The Case of the Crashed Phone Call - Mark's Blog. Mark Russinovich presents a new case where VoIP calls keep crashing David Solomon's Vista system.  Great troubleshooting exercise.

  • How do I Fix a Corrupted Virtual Hard Disk? - Virtual PC Guy's WebLog.  Ben Armstrong provides some great information regarding the structure and troubleshooting of VirtualPC VHD (Virtual Hard Disk) files.

  • Cross Platform Sysprep'ing with XP SP3 - David Remy's "Ping" blog.  David is one of my prime go-to sources for information and answers with Sysprep.  In this guide, he shows how to deal with cross-core hardware cloning (AMD <-> Intel) deployments with Sysprep.  Not a common situation, but good information to keep handy.

  • Fix for Windows Vista Black Screen of Death, aka KSOD - the back room tech.  Julie does it again with a great find for Vista support staff.  When the black-screen-of-death occurs just after reboot, you are presented with "a black screen with a white mouse cursor and nothing else ever loads (no logon screen, etc). Safe mode does the same thing. Last Known Good configuration and System Restore do not fix it except in rare cases where performing a System Restore to 1 month ago or earlier does..."  The fix Julie found involves the off-line editing of the system's registry, and a particular registry key.

  • Download details: IE App Compat VHD - Microsoft Downloads - I know I posted it before but I'm sticking it here since I keep coming back for it.  MS has updated their free VHD builds of XP and Vista for IE testing so that these don't expire until April 09.  I keep these handy for quick and painless testing of software and applications.

  • The Internet Explorer 8 User-Agent String (Updated Edition) - IEBlog - Brief info on how the User-Agent string is presented to web servers in IE8.

  • IE8 in Windows 7 Beta - IEBlog - Turns out that Windows 7 Beta actually is using a modified version of IE 8 beta.  This post gets into the particulars.

  • Make Microsoft Remote Desktop A Portable App - MakeUseOf.com - We use a Novell remote desktop support product in our shop, and at home I use ShowMyPC.com as a free and easy remote-support solution.  But I do like portable applications, and learning the elements that make it up was interesting, although as a post commenter stated, I'm not sure what purpose this fulfills.

  • RSS-powered Windows 7 desktop slideshows - istartedsomething - Long Zheng dishes up some clever work for W7 and provides us the method (and packages) to serve up RSS image feeds directly to the desktop.  Still hack/beta-level work at the moment, but Long does show us the possibilities that W7 may offer in the future.

--Claus V.



Read More...

[Source: Grand Stream Dreams - Posted by Kishore Vengala]

Double-On Call Duty Linkpost

Yep.  Saturday.  Been a very long week at work with our crack IT team presented with some very challenging system failures, office moves, and ongoing project management.

One of those herding-cats kind of weeks.

This weekend there is a big server migration project and a few very dedicated individuals from our team are guiding the transition on our systems.  Meanwhile the rest of us are on-call over the weekend to respond to local sites if something tanks.  So far, so good.  But having my work systems up all weekend and all the team-leadership engaged has still meant a larger than normal flurry of emails and other-project communications for me. Thus my first on-call duty.

Meanwhile, Lavie has found a hidden reserve of energy and has decided to plan for a rearrangement of the family-room furniture.  So I've been happy to provide logistical support for this duty as well.

So while I work double-duty, kick back and raise one for Claus and take a look at this miscellaneous linkage.

Utilities and such

  • PeaZip - (freeware) - Updated to v2.5, this version incorporates a number of optimizations, GUI updates, OS interaction tweaks and other refinements.  There are lots of compressed file managers and I really like this one.  PeaZip also supports the 7-Zip compression format.  For another compatible tool that has a much easier-to-use interface than 7-Zip, check out jZip as well.

  • NirBlog: Utilities update for 25/01/2009 - Nir Sofer lists the latest tweaks to his awesome tools.

  • RegScanner - (freeware) - Updated to version 1.75. "RegScanner is a small utility that allows you to scan the Registry, find the desired Registry values that match to the specified search criteria, and display them in one list." This version adds a new option that shows found items during the scan process.

  • SysExporter - (freeware) - Updated to version 1.50. "SysExporter utility allows you to grab the data stored in standard list-views, tree-views, list boxes, combo boxes, text-boxes, and WebBrowser/HTML controls from almost any application running on your system, and export it to text, HTML or XML file." This really helps me extract data and information from error boxes or other special window notifications. This version adds the ability to "locate the desired window simply by dragging the target icon from the SysExporter toolbar into the window that you need to grab the data."

  • CurrPorts - (freeware) - Updated to version 1.60. "CurrPorts displays the list of all currently opened TCP/IP and UDP ports on your local computer." This version adds three new features:
    • Added new column: Window Title (The window title of the process)
    • Added 'Clear All Filters' option.
    • Added 'Include Selected Processes In Filters' option. Allows you to easily filter by selected processes.

  • PasswordFox - (freeware) - Updated to version 1.11. "PasswordFox is a small password recovery tool that allows you to view the user names and passwords stored by Mozilla Firefox Web browser."  Adds a new option in the 'Select Folders' dialog-box: "Remember the folder settings the next time that you use PasswordFox."

  • Download ATI Catalyst Drivers 9.1 XP - FileHippo.com - Ah yes, the never-ending march of updating the video drivers of a system continues.

  • CrunchBang Linux - I have to confess: with all the WinPE work I've been doing, it has been almost a year since I've spent any amount of time working with a desktop-Linux system or LiveCD.  I still reach for and use some forensics-specific Linux LiveCDs, but my days of fiddling with DamnSmallLinux or Knoppix have been few and far between.  So the stripped-down and light look of this implementation looks pretty nice and attractive to me.

  • A Portable Remote Desktop Connection (mstsc.exe) - the back room tech blog - Julie saw my post about making a portable version of Windows Remote Desktop.  I found it interesting but not practical for my daily remote needs.  Leave it to the ever-clever Julie to find a deployment scenario that makes wonderful use of this trick.

Browser Bits

  • Firefox Showcase - Mozilla Add-ons. This week I was having to monitor multiple network traffic graphs and Firefox doesn't allow you to do side-by-side windows in a single browser session.  I had used and liked Viamatic foXpose but it isn't compatible with FF3.x and development appears dead for now.  So I did some searching and found Firefox Showcase.  It has lots of great features.  Besides allowing for display of open tabbed windows in a single view, any of those "thumbnails" can be refreshed or browsed accordingly.  It also supports placement of the tab "thumbnail" views in a sidebar, much like Tab Sidebar. However, Firefox Showcase provides many more features.  Lots of options!  Check it out.

  • Convenience is number one factor in keeping browsers secure - Ars Technica - Information from a limited sample set still provides some neat thoughts.  Firefox seems to be the most quickly updated web browser by its users.  Here's my thought: Firefox has an internal self-checking updater. If enabled, as soon as updates are offered and found, the user has the chance to update. Opera's latest release versions look to now do the same.  Internet Explorer users have to wait for IE to be updated as part of Windows Update policy settings or manual checks via the OS, so updating will be much less frequent in that case.  I'm not even sure how Apple's Safari browser updating process works.  Does it "phone home" for update checks? Is there an internal (manual) way to check for available updates?  The only times I have seen it updated are when I do a seed-version update or it is offered via a QuickTime/iTunes Apple Software Update run.  Chrome at least has an update feature that also works automatically (or manually) to protect the user, similar to Firefox. I agree that the easier the developers make a browser to automatically update itself, the more secure it will be for the end-user.

  • AdSweep - a clever little tool that helps clean up ad content in Chrome and Opera.  Works a bit like Firefox's Ad-block type of extensions.  Installation is a bit more technical, as "plug-in" support for Chrome and Opera isn't quite as seamless as Firefox's. However, it is a start and not too hard to do.  Spotted via Lifehacker's AdSweep Blocks Ads in Google Chrome and Opera post.

M-Lab - Google Networking Tools Collection

We have a number of network traffic monitoring tools and resources at our disposal, along with an elite-team of top-tier networking systems specialists.  However things get a bit more dicey when trying to see what is going on outside our routers and local-area networks before we escalate issues up the problem resolution food-chain.  Sure, we can always run a Speedtest but that is pretty limited.

This new Google project partnership, M-Lab, looks like it can provide us a selection of additional tools to see what is going on with the network. Home users could benefit as well.

Data is golden when troubleshooting network issues.

  • Network Diagnostic Tool  - Test your connection speed and receive sophisticated diagnosis of problems limiting speed.

  • Glasnost - Test whether BitTorrent is being blocked or throttled.

  • Network Path and Application Diagnosis  - Diagnose common problems that impact last-mile broadband networks.

  • DiffProbe (coming soon)  - Determine whether an ISP is giving some traffic a lower priority than other traffic.

  • NANO (coming soon)  - Determine whether an ISP is degrading the performance of a certain subset of users, applications, or destinations.

Prepare to wait a while before some of these tests kick off. They look pretty popular at the moment.

Supporting information and details from other technical locations.

--Claus V.



Read More...

[Source: Grand Stream Dreams - Posted by Kishore Vengala]

U3 USB Thumb Drive

Experiencing an issue with a U3 USB thumb drive; the Dell Optiplex GX280 restarts whenever the drive is removed from the machine - it restarts if ejected, safely removed or simply pulled. Other USB drives that do not have the U3 feature are not experiencing this issue. OS is WinXP.

Info about the U3 drives: they store your programs and passwords and offer other security features. I'm curious whether the U3 drive is supposed to restart the machine out of an abundance of caution.

Thank you for your help

Read More...

[Source: Webmaster Forum - Posted by Kishore Vengala]

Vista's User Account Control prompts

Hello. Most people find the UAC prompts annoying; however, I do not. I find them a good idea, although they only prompt you to "Continue". Is there a way to make it request a password/fingerprint? The way it is currently configured, anyone can select "Continue" without realizing the consequences, so I would like to be the only one capable of making that decision.

Does anyone know of a quick solution to this? Thank you very much.

Read More...

[Source: Webmaster Forum - Posted by Kishore Vengala]