"Huh? Isn't that counter-intuitive?", you ask?
I realized something particular as a result of some experiences and wrote that line down, but even to me it sounds counter-intuitive; so let me hasten to explain.
A beginner, for the purposes of this discussion, is somebody who's beginning something. This could be a novice starting to learn a skill; but it could also be an expert who's beginning a new project within his area of expertise.
A beginner (thus defined) is hindered by the presence of tools and frameworks for two reasons:
- They obscure the what and the how of the problem at hand by hiding them within themselves (usually for novice beginners).
- They prevent easy exploration of the why of the problem and its solution space through ceremony and restricted access (usually for experienced beginners).
Story#1
My team had just got a bunch of freshers. They were recent engineering graduates (presumably with some exposure to programming) who had passed through the company's training program (which again presumably imparts further such exposure). We found, however, that they couldn't do some simple tasks like write code outside Eclipse. They didn't know how to deploy a web application except through the Eclipse plugin, had never debugged an application via logs, and in fact didn't know about webapps as an idea independent of "Tomcat". Their OO concepts were shaky at best, yet they had implemented small-but-complete web applications using Tomcat, Struts and Hibernate and passed an EJB exam. When asked to build the same study app from scratch using the command line, however, they were lost. When asked to build a different application (than the one they'd done) *using Eclipse*, they were similarly lost.
While a large portion of the blame should rightly lie with the teaching methodology (or lack of one), the tools and frameworks too, IMO, should bear some of it. "Deploying" for them meant clicking on the Tomcat icon in Eclipse, so they had no need to know what "web.xml" did, nor did they know that it was no longer required. The same wizards and menu options that make the life of a practitioner easy actually obscure the underlying process (and why it's required) from a beginner.
Story#2
The same group of freshers were slowly getting on track with (re)learning the basics of programming when I thought it might be a good idea to instill in them, at this "early age", the values of Test Driven Development. I immediately checked myself, however, because they'd have to learn JUnit and how to use it. On second thought, however, I realized that they didn't HAVE to use JUnit or any such framework to do TDD. All they had to do was write a test before writing the actual code, have it fail, write the code and have it pass the test. The test could be code in the same function, or in main(), or a series of calls to the program stored as a shell script, or JUnit tests. All of these are equally valid as "tests". We generally, however, recognize only the last of them as tests. The concept of TDD has been usurped by the concepts of the tools that implement it - to the extent that TDD doesn't seem to have a life outside of those tools.
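To make the "test in main()" idea concrete, here's a minimal sketch of frameworkless TDD in Java. The class, method and check() helper are all made up for illustration; the point is that plain code plus a thrown AssertionError is a perfectly valid test harness.

```java
// Frameworkless TDD sketch: the "test" is just a main() method.
// Write the checks first, watch them fail, then write fizzBuzz()
// until they pass. No JUnit involved.
public class FizzBuzzTest {

    // The code under test, written after the checks below first failed.
    public static String fizzBuzz(int n) {
        if (n % 15 == 0) return "FizzBuzz";
        if (n % 3 == 0) return "Fizz";
        if (n % 5 == 0) return "Buzz";
        return String.valueOf(n);
    }

    // A hand-rolled assertion: fail loudly with a label.
    static void check(boolean condition, String label) {
        if (!condition) throw new AssertionError("FAILED: " + label);
    }

    public static void main(String[] args) {
        check(fizzBuzz(3).equals("Fizz"), "3 -> Fizz");
        check(fizzBuzz(5).equals("Buzz"), "5 -> Buzz");
        check(fizzBuzz(15).equals("FizzBuzz"), "15 -> FizzBuzz");
        check(fizzBuzz(7).equals("7"), "7 -> 7");
        System.out.println("All tests passed");
    }
}
```

The red-green cycle is exactly the same as with JUnit: run main(), see the AssertionError, fix the code, run again.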
Tools, therefore, seem to be actively scuttling the consumption and adoption of concepts, even though they were created explicitly to automate the repeated application of known concepts.
I personally have been struggling with this - the guilt of not doing TDD vs the allure of just seeing working code - especially when I'm beginning something and still feeling out the problem and solution spaces. In some cases the solution space doesn't have readily available tools (BDD on the browser, anyone?) and in others there are tools but I'm still not ready to commit to them because I don't know what my solution is yet (should I build the parser first to figure out the syntax, or the AST interpreter to see how it would run?). My liberation came when I declared that a test will be whatever I call a test for the situation at hand, not what some framework determines to be one. Since then the test-red-code-green-repeat cycle has been a much more doable one.
Full Circle
Back to the initial "Huh?" moment from the beginning. Why then do we generally consider tools to be useful - especially for beginners? Tools are generally time-savers. They do one thing and they do it well; and that is their value. They do, however, have an "operating range" in which they're most useful. Below that range they're overkill, and above it they're obstructive - as depicted in this highly accurate graph of problem size vs tool effectiveness:
So when we usually talk about tools being useful, we're talking about that useful operating range. For beginners specifically, tools are solution accelerators within that range. The stories presented here, however, represent the two ends of the spectrum, where tools are sub-optimal.
Note: I've glossed over frameworks in this discussion, but the concept is the same - or applies even more so to frameworks. Frameworks by definition are a standard solution to a common problem, with room for customization so that application specifics can still be implemented. A framework is one because it has a known world view and exposes an interface that allows operations on that world view. The concept of an operating range is well-ingrained, therefore, as are those of the limits on either side. So please read "tools/frameworks" wherever you see "tools" in this article.
So...
Armed with this framework for evaluating tools, we can start asking some interesting questions.
- What is a good tool?
- When are tools not required?
- When are tools required?
- How do we determine the operating range of a tool, then?
- What can we do to use tools more effectively?
- What can tool builders do to make effective tools?
Attempts at answers to these questions in part 2 of this article.