Posts Tagged “tips”
Oct 5, 2017
Inspired by similar articles I have read, I decided to give my two cents on writing scenarios for role-playing games. While I’m far from being an expert in the matter, let alone a half-decent writer, I have written several scenarios that seem to have clicked with some people. These are my principles when writing story-centric scenarios (beware spoilers of most of my scenarios, don’t read if you intend to play them!):
Remember the story is not linear: don’t write as if you were writing a short story. In a way, writing a scenario is writing down your (obsessively detailed) research for a short story. Focus on the mood, possible scenes, characters, general plot, and clues, and improvise the story from there. Example: Gone Girls has a list of characters and possible scenes and locations, but no order is implied, or even that all scenes will happen. Characters are described with their goals and knowledge, and several possible endings are described for reference.
Have a theme/topic for the story: something like family, prejudices, the cost of freedom, or loyalty. A story theme will help you focus while writing, and it will give the scenario a certain consistency. It will also give you ideas for possible scenes or for plot elements, when used literally or metaphorically. And don’t worry if you think the players won’t catch the metaphors: they still give the scenario a certain feel and focus. Example: Suffragettes is about class warfare from a feminist point of view. One of the metaphors is that the protagonists are fighting the patriarchy. And thus, the antagonists are middle- to high-class people who worship a deity they call “Father”, based on Father Dagon.
Know the important NPCs well enough: you should know how your NPCs (non-player characters; anyone who isn’t the protagonists) will react to different situations. It helps to write down a couple of likely situations. Example: Suffragettes (page 5) has a relatively in-depth description of what Florence knows and how she will react in different situations.
Make/get maps of the most important locations: they are handy for consistency, especially if it’s possible there will be an action scene in them. Example: The Cultists has a full map of the prison, even if the players are very unlikely to see it all.
Make a timeline of events: if there are certain things that will happen regardless of what the characters do, make a timeline. Example: Gone Girls has a timeline of events both leading to the beginning of the story, and happening as the story develops.
Treat the scenario as resources and ideas when improvising: in the end, you will have to make up a bunch of the stuff on the spot, and also it’s satisfying to change or make up new elements to adapt the story to whatever the players found interesting, or to incorporate ideas the players give you as the story develops. Example: once, when telling Gone Girls, the idea of making Edward Clarke invincible came up, along with the idea of making him being able to manipulate opponents to the extent of making them kill themselves. This was never part of the original story but made sense that one time and made the ending more dramatic.
Show, don’t tell! Instead of telling the players about certain important things (eg. some character is racist, some character is lazy, a room is a mess), set up a situation that makes the point. Not only is it more memorable, it also gives nuance and extra information. Saying “Tom is lazy” is generic and vague, but seeing how Tom still has boxes from when he moved in, a mess of cables all over the floor, and a pile of dirty dishes in the sink says just how lazy he is, and in which situations. Example: in Suffragettes (page 9), Elise Samson is not simply described as “poor” or “homeless”. Instead, there’s a short sequence in which this is shown through a situation.
And that’s it! I hope you find this list useful. As a bonus tip, if you are writing horror (interactive or not) I recommend you read my summary of the book “Writing Monsters”, and maybe read the actual book, too.
Dec 12, 2016
This is my summary of the book “2k to 10k: Writing Faster, Writing Better, and Writing More of What You Love” by Rachel Aaron. It’s a very short e-book (also available as audio book) with tips for writers. It’s only $0.99 so definitely worth the money and the time if you’re looking for some writing advice and tips.
The book is divided into two parts: the daily process and the background work that allows for efficient writing. The second part is somewhat more subjective and personal and might not apply equally well to everybody.
Part one: Process
Many (competent, even!) writers equate writing quickly with being a hack. The author obviously doesn’t agree, and thinks the secret of her method is that it removes dead time and waiting. The method is based on three requirements. Improving any of the three is a win, but improving all three is best.
Knowledge: The most important of all three. Know what you’re writing before you do it. Not macro-level plot, but concrete details: the exchanges in an argument, or very rough descriptions. Five minutes is about enough to cover all the writing for a day.
Time: Record your word output per session for a while and figure out patterns. Do you write better/more when you write for at least two or three hours? At home? At the coffee shop? Without internet? In the morning or evening? Once you figure it out, try to make all of your writing sessions be like that.
Enthusiasm: Write stuff that keeps you enthusiastic. If you didn’t enjoy writing it, it’s likely that readers won’t have fun reading it. When planning the writing for the day, try to play the scenes in your head. If there’s any scene that you are not excited about, change it or drop it. Similarly, if you struggle to write one day, reflect on what you’re writing and figure out if you need to change anything. The process should be enjoyable.
Part two: Tips for Plotting, Characters, Editing
Plotting in 5 steps
To decide which book to write, choose an idea from the pool of ideas you have in your notebook, blog, or wherever. Signs to tell if an idea is worth the time/effort required for a novel: you cannot stop thinking about it; it writes itself (related to the previous point); you can see the finished product; and you can easily explain why others would want to read it.
Get Down What You Already Know. Characters, situations, magical systems, settings. Scrivener mentioned as the best thing ever.
The Basics. Start filling out the gaps from the first step, enough to figure out the bare bones of characters (main characters, antagonists and power players), plot (end and beginning, in that order, plus twists, scenes and climaxes you already know of; also the kind of story this will be), and setting (magic system if applicable, basic political system, general feel of places: technology level, culture, power).
Filling In The Holes. You already have the plot beginning, some interesting middle points, and the end. Tips for when you get stuck are on page 28. This step is finished when you can tell the whole plot, start to finish, without skipped scenes.
Building a Firm Foundation. Make a time line, draw a map, write out who knows what and when, memorise everyone’s particulars, write out a scene list, do a word count estimation, and do a boredom check (go through the whole plot: if some scene is hard to visualise or feels slow, figure out why).
Start Writing! Remember that no matter how carefully you have plotted, the story and/or characters will probably change dramatically.
Characters Who Write Their Own Stories
Characters with agency (that can make decisions that change the direction of the plot) write their own stories. They will help you get from one plot point to the next. Examples on pages 36 and 37. The basic character sheet consists of name, age, physical description, what they like, what they hate, and what they want more than anything. It’s filled in during step 2 above. The rest of the character development happens as the novel is written, like a braid: this gives easier and better results.
The Story Architect
Most stories follow a three-act structure (Act I, put your characters in a tree; Act II, light the tree on fire; Act III, get your characters out of the tree). Act II is normally the longest. Act III is the climax, the big event. It has a lot of tension, and it shouldn’t be too long because the tension will fade. Don’t forget the resolution at the end: readers need closure for the characters, and to enjoy their victory. That doesn’t mean the book has to end happily: the point is tension relief.
The Two Bird Minimum
Scenes should do at least two of three things: advance the story, reveal new information, and pull the reader forward. Sometimes combining several scenes into one can be interesting and add tension, and it makes the story leaner.
Editing for People Who Hate Editing
Many people dread editing and think they cannot do it, but it’s just a skill that can be improved. Tips on approach:
Change the Way You Think about Editing. The end goal of editing is the reader’s experience: polishing the text so it doesn’t just contain the story, but is also nice to read.
Editing Tools. Three tools to identify the problems the text has: the updated scene map (tip: mark the types of scenes, like love, main plot, and secondary plot, and make sure their distribution throughout the text is not too uneven), the time line (include important things other characters were doing “off screen”; it helps find timing problems, action that is too loose or too tight, lagging tension, etc), and the to-do list (the list of problems you have found).
Actually Editing. Take the to-do list and start fixing. Always biggest/hairiest problems first, never first page to last. Then do a read-through, making a new to-do list (typos and small things can be fixed on the spot), and possibly more read-throughs if the to-do list was big. Finally, read one more time, but from the reader’s POV (tip: use a reading device, not the computer used to write the manuscript). At this point you can involve other people, never before. Remember that involving other people means more rounds of editing. At least three more rounds is normal.
Here you have a pretty compact summary of the book, mostly useful for reference and to get a sense of what the book covers. Note that I skipped the chapter with advice for new writers and some other minor stuff, though. If you like this, go support the author (seriously, it’s just one dollah).
Feb 15, 2012
This is the second part of my unit testing advice. See the first part on this blog.
If you need any introduction you should really read the first part. I’ll just present the other three ideas I wanted to cover.
Focusing on common cases
This consists of testing only/mostly common cases. These tests rarely fail and give a false sense of security. Thus, tests are better when they also include less common cases, as they’re much more likely to break inadvertently. Common cases not only break far less often, but will probably be caught reasonably fast once someone tries to use the buggy code, so testing them has comparatively less value than testing less common cases.
The best example I found was in the wrap_string tests. The relevant example was adding the string “A test of string wrapping…”, which wraps not into two lines, but three (the wrapping is done only on spaces, so “wrapping…” is taken as a single unit; in this sense, my test case could have been clearer and used a very long word instead of a word followed by an ellipsis). Most of the cases we’ll deal with will simply wrap a given text into two lines, but wrapping into three must work, too, and it’s much more likely to break if we decide to refactor or rewrite the code in that function with the intention of keeping the functionality intact.
See other examples of this in aa20bce (no tests with more than one consecutive newline, no tests with lines of only non-printable characters), b248b3f (no tests with just dots, no valid cases with more than one consecutive slash, no invalid cases with content other than slashes), 5e771ab (no directories or hidden files), f8ecac5 (invalid hex characters don’t fail, but produce strange behaviour instead; this test actually discovered a bug), 7856643 (broken escaped content) and 87e9f89 (trailing garbage).
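To make the idea concrete outside of Tor’s C code, here is a language-agnostic sketch in Python. The `wrap_string` function here is a hypothetical toy, not Tor’s implementation; the point is the contrast between a common-case test and a less common one:

```python
def wrap_string(text, width):
    """Toy word-wrap: break only at spaces; a 'word' longer than
    width still goes on its own line, unbroken."""
    lines, current = [], ""
    for word in text.split(" "):
        if not current:
            current = word
        elif len(current) + 1 + len(word) <= width:
            current += " " + word
        else:
            lines.append(current)
            current = word
    if current:
        lines.append(current)
    return lines

# Common case: a simple two-line wrap. If this breaks, someone will
# notice quickly anyway, so the test adds comparatively little.
assert wrap_string("hello there world", 11) == ["hello there", "world"]

# Less common case: an unbreakable "word" forces a third line.
# This is the test that saves you during a refactor or rewrite.
assert wrap_string("a test of string wrapping...", 10) == \
    ["a test of", "string", "wrapping..."]
```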
Not trying to make the tests fail
This is related to the previous one, but the emphasis is on trying to choose tests that we think will fail (either now or in the future). My impression is that people often fail to do this because they are trying to prove that the code works, which misses the point of testing. The point is trying to prove the code doesn’t work. And hope that you fail at it, if you will.
The only example I could find was in the strcasecmpend tests. Note how there’s a test that checks that the last three characters of the string “abcDEf” (ie. “DEf”) compare as less than “deg” case-insensitively. That’s almost pointless, because if we made that same comparison case-sensitively (in other words, if the “case” part of the function breaks) the test still passes! Thus it’s much better to compare the strings “abcdef” and “Deg”.
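A sketch of the same trap in Python (the `strcasecmp_end` helper below is a hypothetical stand-in for Tor’s C function, not its actual code):

```python
def strcasecmp_end(s, end):
    """Compare the tail of s against end, case-insensitively.
    Returns negative/zero/positive, like strcmp."""
    tail = s[-len(end):].lower()
    end = end.lower()
    return (tail > end) - (tail < end)

# Weak test: it passes even if the case-folding breaks, because
# "DEf" < "deg" holds case-SENSITIVELY too (uppercase sorts before
# lowercase in ASCII). It can't fail, so it proves little.
assert strcasecmp_end("abcDEf", "deg") < 0

# Stronger test: case-sensitively, "def" > "Deg" ('d' > 'D' in
# ASCII), so a broken implementation would get the sign wrong here.
assert strcasecmp_end("abcdef", "Deg") < 0
```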
Addendum: trying to cover all cases in the tests
There’s another problem I wanted to mention, which I have seen several times before, although not in the Tor tests. The problem is making complicated tests that try to cover many/all cases. This seems to stem from the idea that having more test cases is good by itself, when actually more tests are only useful when they increase the chances of catching bugs. For example, if you write tests for a “sum” function and you’re already testing [5, 6, 3, 7], it’s probably pointless to add a test for [1, 4, 6, 5]. A test that would increase the chances of catching bugs would probably look more like [-4, 0, 4, 5.6].
So what’s wrong with having more tests than necessary? The problem is they make the test suite slower, harder to understand at a glance, and harder to review. If they don’t contribute anything to the chances of catching bugs anyway, why pay that price? But the biggest problem is when we try to cover so many test cases that the code generates the test data. In these cases, we have all the above problems, plus the test suite becomes almost as complex as production code. Such tests become much easier to introduce bugs into, harder to follow, etc. The tests are our safety net, so we should be fairly sure that they work as expected.
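The sum example can be sketched like this (a toy function, just to show which extra tests earn their keep and which don’t):

```python
def total(xs):
    """Toy sum function under test."""
    result = 0
    for x in xs:
        result += x
    return result

# Existing test:
assert total([5, 6, 3, 7]) == 21

# Redundant: same shape as the test above, exercises nothing new.
assert total([1, 4, 6, 5]) == 16

# More valuable: negatives, zero, a float, and the empty list each
# probe a different way the code could be wrong.
assert total([-4, 0, 4, 5.6]) == 5.6
assert total([]) == 0
```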
And that’s the end of the tips. I hope they were useful :-)
Feb 14, 2012
When reviewing tests written by other people I see patterns in the improvements I would make. As I realise that these “mistakes” are also made by experienced hackers, I thought it would be useful to write about them. The extra push to write about this now was having concrete examples from my recent involvement in Tor, that will hopefully illustrate these ideas.
These ideas are presented in no particular order. Each of them has a brief explanation, a concrete example from the Tor tests, and, if applicable, pointers to other commits that illustrate the same idea. Before you read on, let me explicitly acknowledge that (1) I know that many people know these principles, but writing about them is a nice reminder; and (2) I’m fully aware that sometimes I need that reminder, too.
Edit: see the second part of this blog.
Tests as spec
Tests are more useful if they can show how the code is supposed to behave, including safeguarding against future misunderstandings. Thus, it doesn’t matter if you know the current implementation will pass those tests or that those test cases won’t add more or different “edge” cases. If those test cases show better how the code behaves (and/or could catch errors if you rewrite the code from scratch with a different design), they’re good to have around.
I think the clearest example was the tests for the eat_whitespace* functions. Two of those functions end in _no_nl, and they only eat initial whitespace (except newlines). The other two functions eat initial whitespace, including newlines… but also eat comments. The tests from line 2280 on are clearly targeted at the second group, as they don’t really represent an interesting use case for the first. However, without those tests, a future maintainer could have thought that the _no_nl functions were supposed to eat comments too, and break the code. That produces confusing errors and bugs, which in turn make people fear touching the code.
See other examples in commits b7b3b99 (escaped ‘%’, negative numbers, %i format string), 618836b (should an empty string be found at the beginning, or not found at all? does “\n” count as beginning of a line? can “\n” be found by itself? what about a string that expands more than one line? what about a line including the “\n”, with and without the haystack having the “\n” at the end?), 63b018ee (how are errors handled? what happens when a %s gets part of a number?), 2210f18 (is a newline only \r\n or \n, or any combination or \r and \n?) and 46bbf6c (check that all non-printable characters are escaped in octal, even if they were originally in hex; check that characters in octal/hex, when they’re printable, appear directly and not in octal).
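The “tests as spec” idea sketched in Python (these helpers are simplified hypothetical stand-ins for the eat_whitespace* family, not Tor’s actual C functions, and they skip the comment-eating behaviour):

```python
def eat_whitespace(s):
    """Strip leading spaces, tabs, and newline characters."""
    return s.lstrip(" \t\r\n")

def eat_whitespace_no_nl(s):
    """Strip leading spaces and tabs, but stop at a newline."""
    return s.lstrip(" \t")

# These two asserts together ARE the spec for the difference between
# the variants: a future maintainer who merges them breaks a test,
# instead of silently changing behaviour.
assert eat_whitespace("  \n  hello") == "hello"
assert eat_whitespace_no_nl("  \n  hello") == "\n  hello"
```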
Testing the boundaries
Boundaries of different kinds are a typical source of bugs, and thus are among the best testing points we have. It’s also good to test both sides of each boundary, both as an example and because bugs can appear on either side (and not necessarily on both at once!).
The best example are the tor_strtok_r_impl tests (a function that is supposed to be compatible with strtok_r, that is, it chops a given string into “tokens” separated by one of the given separator characters). In fact, these extra tests discovered an actual bug in the implementation (ie. an incompatibility with strtok_r). Those extra tests asked a couple of interesting questions, including “when a string ends in the token separator, is there an empty token at the end?” in the “howdy!” example. This test can also be considered valuable as “tests as spec”, if you consider that the answer to the above question is not obvious and both answers could be considered correct.
See other examples in commits 5740e0f (checking whether tor_snprintf correctly counts the number of bytes, as opposed to characters, when calculating if something can fit in a string; also note my embarrassing mistake of testing snprintf, and not tor_snprintf, later in the same commit), 46bbf6c (check that character 21 doesn’t make a difference, but 20 does) and 725d6ef (testing 129 is very good, but even better together with 128, or, in this case, 7 and 8).
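Here is the both-sides-of-the-boundary idea as a Python sketch. The `copy_into` helper is hypothetical; it only mimics the “does it fit, counting the terminating NUL?” decision of a snprintf-style call:

```python
def copy_into(dest_size, s):
    """Return s if it fits in a dest_size buffer, counting the
    terminating NUL as in C; return None if it doesn't fit."""
    if len(s) + 1 > dest_size:
        return None
    return s

# Test BOTH sides of the boundary: off-by-one bugs live exactly here,
# and a bug can appear on either side independently.
assert copy_into(8, "1234567") == "1234567"  # 7 chars + NUL: just fits
assert copy_into(8, "12345678") is None      # 8 chars + NUL: one too many
```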
Testing implementation details
Testing implementation details tends to be a bad idea. You can usually argue you’re testing implementation details if you’re not getting the test information from the APIs provided by whatever you’re testing. For example, if you test some API that inserts data into a database by checking the database directly, or if you verify the result of a method call by checking the object’s internals or calling protected/private methods. There are two reasons why this is a bad idea: first, the more implementation details your tests depend on, the fewer implementation details you can change without breaking your tests; second, your tests are typically less readable because they’re cluttered with details instead of meaningful code.
The only example of this I encountered in Tor was the compression tests. In this case it wasn’t a big deal, really, but I have seen this before in much worse situations, and I feel it illustrates the point well enough. The problem with that deleted line is that it’s not clear what its purpose is (it needs a comment), plus it uses a magic number, meaning that if someone ever changes that number by mistake, it’s not obvious whether the problem is in the code or in the test. Besides, we are already checking that the magic number is correct by calling detect_compression_method. Thus, the deleted memcmp doesn’t add any value, and makes our tests harder to read. Verdict: delete!
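A sketch of the same principle in Python, using the standard zlib module as a stand-in compressor (the `detect_compression_method` helper is hypothetical, not Tor’s):

```python
import zlib

def compress(data):
    """Stand-in for the compression code under test."""
    return zlib.compress(data)

def detect_compression_method(blob):
    """Hypothetical detector: zlib streams start with byte 0x78."""
    return "zlib" if blob[:1] == b"\x78" else "unknown"

payload = b"hello hello hello"
blob = compress(payload)

# Good: assert through the public API we actually ship.
assert detect_compression_method(blob) == "zlib"
assert zlib.decompress(blob) == payload

# Bad (don't do this): checking magic bytes directly duplicates
# detect_compression_method, relies on a magic number, and couples
# the test to the on-the-wire format:
# assert blob[:2] == b"\x78\x9c"
```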
I hope you liked the examples so far. My next post will contain the second half of the tips.
Oct 28, 2008
From time to time I like making panorama pictures. When I started several years ago, Autostitch was really popular, but it didn’t have a Linux version, which sucked. Actually, it still doesn’t. However, it worked under wine, so I just used it via emulation. It was very simple and worked ok.
Sometimes I’d look for alternatives under Linux (if possible, free) and I had seen a tool called Hugin. It looked complicated (at least compared to Autostitch’s select-pictures-hit-ok-there-you-go), and for some reason I never really used it. It probably wasn’t packaged for Debian or something like that.
A couple of days ago, though, I arrived from a trip where I had taken a couple of panoramas, and Autostitch behaved quite suboptimally: it didn’t recognise one of my panoramas, and some others were completely destroyed perspective-wise. So I decided to give Hugin another go. And boy am I happy with it. It’s very easy to install in Debian, and although I had some trouble with the path to enblend (apparently I had to specify the absolute path to it in the preferences), everything worked fine. Selecting the points to join the pictures is not that hard, and it actually has one advantage over Autostitch: if Hugin doesn’t recognise your panoramas automatically, you can give it “hints” about which points are the same across pictures, so it will still work. Another advantage is that it has several ways of joining the pictures, which solved my second problem with perspective destruction :-)
Apart from the panorama pictures, I also had some videos… and one of them was recorded as “portrait” instead of “landscape”. So I needed a way to rotate the video. Fortunately, that was easy enough with mencoder (from the command line, though):
mencoder -vop rotate=2 MVI_2352.AVI -ovc lavc -oac copy -o MVI_2352.avi
I found the tip in some thread on the Ubuntu forums, and had to look up the values for “rotate” in mencoder’s manpage:
0: Rotate by 90 degrees clockwise and flip (default).
1: Rotate by 90 degrees clockwise.
2: Rotate by 90 degrees counterclockwise.
3: Rotate by 90 degrees counterclockwise and flip.
Sep 22, 2008
Today I was playing with GnuPG, trying to add a couple of public keys to an “external” keyring (some random file, not my own keyring). Why? you ask. Well, I was preparing some Debian package containing GPG keys for APT repository signing.
The point is, I was really confused for quite a while because, after reading the gpg manpage, I was trying things like:
gpg --no-default-keyring --keyring keys.gpg --import … # Wrong!
But that wouldn’t add anything to keys.gpg, which I swear I had in the current directory. After a lot of wondering, I realised that gpg interprets keyring paths as relative to ~/.gnupg, not the current directory. I guess it’s for security reasons, but I find it really confusing.
The lesson learned: always use --keyring ./keys.gpg or, better, never use keys.gpg as the filename for external keyrings, but something more explicit and “non-standard”.
Sep 22, 2008
Today I have had a gigantic e-mail spam attack. And by “gigantic” I mean something like one message every couple of seconds. It seems to have stopped by now, though (maybe until tomorrow, sigh). However, there is a small tip that I used in the meantime, and I have found it helps me filter spam, so I thought I’d share it with you. It’s very simple: sorting by subject instead of by date. Of course, you have to filter your view to only unread messages, but it works surprisingly well.
This is very easy to do in mutt, my mail reader of choice (for personal e-mail; I have found that, at least for work e-mail, Opera’s M2 works quite well too). You just have to limit to unread messages (pressing lowercase “l” and then using “~N” as the filter), and then sort by subject (:set sort=subject). I have even created two macros in mutt to switch back and forth between “spam filtering mode” and “normal mode”:
macro index Cs ":set sort=subject<return>l~N<return>"
macro index Cq ":set sort=threads<return>lall<return>"
Let’s hope it doesn’t begin again tomorrow.