Tuesday, June 28, 2011

Economy of actions

When 4e first came out I immediately noticed that they had upped the importance of managing the economy of actions during combat. This would be the "second economy" with regard to my last discussion on the topic. First, actions themselves were more formalized. In hindsight the "full round action" was sort of odd, and the late introduction of the swift action often left the mechanics surrounding it unbalanced. Having standard, move, and minor actions from the outset provided greater clarity and balance, but also put more power into each action. This is the second impact--with more power embedded in an action comes greater importance in using that power prudently.

I designed a character sheet for my very first 4e character to help facilitate the process of maximizing actions. It worked awesomely and helped me quickly select my actions and be effective. My table-mates, though, weren't quite as quick and bogged the game down in trying to find the combination of actions that would be most effective. It was really slow.

I am working on a hypothesis that the game would operate more quickly if actions did more reliable things. Naturally, something will always break the mold and exceptions are part of the fun of D&D, but if most actions fall into reliable routines, it might help people choose by smoothing out the selection process.

Here's the outline:
  • Standard actions. This is the most important of the actions. The majority of the round's output is tied up in the standard action. It is the attack or spell or whatever that *defines* the round and it is enhanced by the other actions. Most standard actions require a check to resolve.
  • Move actions. Move actions provide access and effects. By access, I mean that they often allow you to move, bringing a target into range. Effects are a bit broader and include things like providing flank, ending the Prone condition, or similar. Because move actions serve a less important function than standard actions, move actions shouldn't jeopardize standard actions.
  • Minor actions. Minor actions should be declarative, but still be a source of power. Minor actions could be powers ("I use Warlock's Curse"), usages of items ("I apply Viper Venom to my blade."), interactive with the environment ("I cut the rope on the chandelier"), or interactive with characters ("I pick his pocket").
This outline is different from the current rules in two key respects. First, 3e and 4e routinely had move actions jeopardize the standard action by making it easy to fail at moving and lose the rest of your turn. That is silly and just encourages people not to move. Second, both editions try to empower all actions. Someone, sometime, got the impression that making movement complex would reward skill-based characters in combat. Instead, it just pushed combat towards stand-and-swing. I think design would be improved and streamlined by acknowledging that the standard action is the main action, and by designing options for the other actions to fuel the standard action.

Alright, so that's the general direction I'm headed. What do you think?

The design spiral

Due to the discussion in the magic item thread, I started rethinking how certain basic math decisions would have to change and I immediately found myself back in the design spiral. Every decision is informed by every previous decision and any future decision ripples through and changes everything. Sometimes it's fun, sometimes it's hell, but always, it's the design spiral.

Friday, June 24, 2011

Question on magic items

One of the ideas that I've seen a lot lately is that magic items shouldn't have pluses; they should go back to "being cool." I fully agree with the sentiment because it is how I like to play my games, but it actually goes against my positive system assumptions mantra in that it is easier to make the pluses automatic than it is to pull them out of the math to go back to the old way.

Also, people need stuff. Magic items are stuff and if you can't give out meaningful rewards, players get a little disenchanted. Maybe you have a uniquely mature gaming group, but most people need stuff.


I haven't put too much thought into magic items just yet, but two ideas have generally arisen as potential solutions.
  1. Characters receive inherent pluses at a level appropriate to balance the math. Magic items also provide pluses but also other beneficial effects. A character may either use their inherent plus or the plus of the item and gain its beneficial effect.
  2. Characters receive inherent pluses at a level appropriate to balance the math. Magic items increase the effective level and also provide other beneficial effects. So if characters receive an inherent +1 at fifth character level, a +2 weapon would confer that benefit at 3rd level instead of 5th.
Each solution has some immediate problems.

Solution One
There is something awkward about the first solution. The idea that a character is "inherently +3" but only benefits at "+2 and flaming" because he is using a magic sword feels like the sword is somehow a cursed sword. That isn't the intent and maybe this solution could be fixed with just a little PR.

The benefit of it is that it introduces interesting tradeoffs. There will be times when a player could attack at +10 but will instead attack at +9 to benefit from an item. That is sort of neat, and it opens the door to giving high level characters, who should be wielding +5 weapons, a +2 weapon with a litany of benefits. It also builds in an expiration date on weapons, so you get to reward players without overpowering characters. A +3 weapon replacing a +2 weapon is incrementally only +1.

Solution Two
The second solution first suffers from a vocabulary issue. The +1 level vs. +1 attack dichotomy would have to be replaced, but that is a small issue. The bigger issue is that the benefit is situational. Let's presume that inherent attack bonuses are at every 5th level. A +4 sword provides no benefit, then, to a 5th level character. This will encourage GMs to give higher plus weapons because they want to push characters to that next step.

At the same time, the benefit always scales which means a weapon is always incrementally powerful. That might at first seem like a benefit, but compare it to solution one. Under solution one, we could introduce the Mace of Kings, a +5 mace that is the scepter of the king. When wielded by the young prince (level one fighter) after his father's death, the Mace of Kings hugely empowers him. If the villain (level 20 cleric) steals it away, he was inherently +5 and so gains little. There is a moderating influence to inherent bonuses under solution one that is absent in solution two.
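
To make the comparison concrete, here is a quick Python sketch of the two solutions. The every-5th-level inherent progression and the function names are my own illustration, not settled rules.

```python
# Illustrative only: assumes characters earn an inherent +1 at every
# 5th character level. Function names are mine, not rules text.

def inherent_bonus(level):
    return level // 5

def solution_one(level, item_plus):
    # Use either your inherent plus or the item's plus, not both.
    return max(inherent_bonus(level), item_plus)

def solution_two(level, item_plus):
    # The item's plus raises the effective level for inherent bonuses.
    return inherent_bonus(level + item_plus)

# The Mace of Kings (+5) in the hands of the prince (level 1 fighter)
# versus the villain (level 20 cleric):
print(solution_one(1, 5), solution_one(20, 5))  # 5 5 -- huge gain vs. a small one
print(solution_two(1, 5), solution_two(20, 5))  # 1 5 -- a +1 increment for both
```

Under solution one the villain's effective bonus barely moves above his inherent +4 while the prince jumps from +0 to +5--the moderating influence described above. Under solution two the mace is worth the same increment to both wielders.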

The plus side of solution two is that at different points in character progression, they'll value different things. A +3 weapon is really valuable at level 7 because it brings you up to the +2 tier. But at level 9, you might be willing to swap it out for a more interesting +1 weapon. Variety is fun and this solution promotes it.

Feedback is appreciated--
  1. What do you look for in magic items? 
  2. What did I miss? 
  3. Which of the solutions do you prefer?

Tactical movement

I need some help.

One of the design goals that I discussed in my big list of elements I want in the game is that it should be less grid reliant. Battle grids are really powerful tools but as soon as you presume their presence they become really powerful crutches. A lot of game design presumes their presence and I think the game is the worse for it. A good example is that difficult terrain costs two squares of movement instead of one. It is a simple, quick rule that has an interesting impact on the game. It also destroys imagination because each square, now, is really, really important. If someone wades through a muddy bog, the bog probably has to conform to square dimensions because it matters whether that square costs two or one. It is also challenging for a GM, assuming no grid, to have an idea how many squares of difficult terrain a character passes through, and almost impossible for a player to be expected to know from a description. Just not a good situation.

So I am trying to distill movement in combat down to its fundamental elements and then build it back up, making sure not to embed the need for a grid with each step. Today all I want to do is figure out what are the most basic movements characters take. I think I got it down to four:
  1. Tactical approach. From outside of a threatened zone, you approach and engage an enemy.
  2. Tactical withdraw. From within a threatened zone, you move away from an enemy.
  3. Tactical move. From outside a threatened zone, you move.
  4. Tactical maneuver. From within a threatened zone, you move and end within the threatened zone.
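
The four cases reduce to a simple start/end test against threatened zones. A minimal sketch (the function name is mine):

```python
# Classify a move purely by whether it starts and ends in a threatened
# zone -- no grid or square-counting required.

def classify_move(starts_threatened: bool, ends_threatened: bool) -> str:
    if starts_threatened and ends_threatened:
        return "tactical maneuver"
    if starts_threatened:
        return "tactical withdraw"
    if ends_threatened:
        return "tactical approach"
    return "tactical move"

print(classify_move(False, True))   # tactical approach
print(classify_move(True, False))   # tactical withdraw
```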

For each of them, I also took a snapshot of how it changes if the threatened range expands. I merely expanded it to two squares, but the principles would be the same even if it were expanded to three or five or whatever. My idea is to develop simple principles for each type of move that are reliable enough that a grid wouldn't be required to adjudicate. The hope is that by focusing on basic statuses (i.e. adjacent) instead of ranges (i.e. within 2 squares) you can have a lot of the same functionality with a lot less rules. It also makes it more natural to how grid-less combat is played with people focusing on whether someone is adjacent, within range, etc.

To that same end, I plan on making opportunity attacks (OAs) relatively rare. Nothing is more annoying than having a player try to take back an action because it is too detrimental, nothing slows the game down more than adding in more attacks, and nothing imbalances combat quicker than adding more attacks. So all in all, they detract a lot. They are neat in that they balance various maneuvers, but there are a lot of ways to add balance.

What I need help with is two fold:
  1. What did I miss? What other types of movement are fundamental to the game and combat?
  2. How many little levers and tools do you think you need? That is, are OAs a good tool? Is reach a good mechanic? Anything similar?

Thursday, June 23, 2011

Adding flavor through talents

One of the things that I loved about 3e classes (and 2e kits) is that they provided a platform to imbue characters with abilities that they otherwise wouldn’t take. Late in my 3e career I tried to make a “generic class” version of 3e where all abilities were feats, and I pretty quickly discovered that few people were interested in buying “wild empathy” when things like sneak attack were on the table. I don’t blame players; I would probably do the same. But the solutions were to either (a) pile on the prerequisites or (b) make things less powerful. Neither proved to be fun, and it helped me understand why abilities are so often siloed into classes—it is an easier method of restricting and allocating abilities.

To that end, I sort of see why 4e basically did away with those flavor powers because they are hard to divvy out. They are, however, still missed and I want them in the game. My working approach is to label them “talents” and assign each class two talents. Characters periodically (presently at each odd character level) select from among their available pool of talents. If you multiclass, you gain access to that class’s talents as well. In addition, your race provides a pool of racial talents. As a result, characters have a wide range of talents to select from and can build their character to epitomize any of their classes, their race, or blend all of the above.

Here is the talent table so far:
Talents by class

  Class       Nimble   Resolve   Nature   Tactician   Adventurer
  Barbarian     X                  X
  Bard          X                                         X
  Druid                   X        X
  Fighter                                     X           X
  Knight                  X                   X
  Mage                             X                      X
  Monk          X         X
  Priest                  X                               X
  Ranger                           X          X
  Rogue         X                             X

Don’t read too much into class names just yet…

A talent is, in general, weaker than a feat. That isn’t to say that they aren’t powerful and they will certainly have an impact on the character, but they aren’t really game changers. They are flavorful, fun, come up sporadically, and help round out the *feel* of a character. Some quick samples:
  • Everburning torch bearer: You may draw, light, and, if desired, toss a torch as a free action.
  • Trackless step: You do not leave tracks. The DC to track any party you are in increases by 5.
  • Pressure points: Gain a +2 bonus to break or burst common items.
  • A healthy gamble: Gain 1d6 hit points.

Admittedly, no one is going to define their character around their talents. They are just a little something extra to let the rules help define a character without complicating the game too much or contributing too greedily to power creep. They're supposed to be fun more than empowering.

There are also higher level talents that require a character level of 7+, and I imagine it would be appropriate to add even more at character level 14+ (sort of dividing the 20 levels into three tiers). I also put a little thought into flaws which, if you choose to take one, provide an additional talent. Flaws are much more punitive than a talent is beneficial because of self-selection power creep.

Wednesday, June 22, 2011

Flattening the power curve

I’ve ranted a lot already about the gap between characters at things like skills and various ways to try and close that gap. This is sometimes referred to as “flattening the power curve.” One way that I haven’t really talked about is alternate methods to reward training that aren’t straight numerical bonuses. With the limited expressions of the d20, numerical bonuses quickly differentiate characters enough that it is tough to challenge them both simultaneously.

What if training let you roll 2d20 and take the highest instead of giving +5?

This table shows the chance to meet or exceed the target number (the leftmost column) on a d20. The first column of results shows that probability on a single d20. The progression shouldn’t surprise anyone. The subsequent columns show the probability of at least one die meeting or exceeding the target when rolling Xd20.

Hit probability by target and number of rolls

  Target   1 roll   2 rolls   3 rolls   4 rolls   5 rolls
     2      95%      100%      100%      100%      100%
     3      90%       99%      100%      100%      100%
     4      85%       98%      100%      100%      100%
     5      80%       96%       99%      100%      100%
     6      75%       94%       98%      100%      100%
     7      70%       91%       97%       99%      100%
     8      65%       88%       96%       98%       99%
     9      60%       84%       94%       97%       99%
    10      55%       80%       91%       96%       98%
    11      50%       75%       88%       94%       97%
    12      45%       70%       83%       91%       95%
    13      40%       64%       78%       87%       92%
    14      35%       58%       73%       82%       88%
    15      30%       51%       66%       76%       83%
    16      25%       44%       58%       68%       76%
    17      20%       36%       49%       59%       67%
    18      15%       28%       39%       48%       56%
    19      10%       19%       27%       34%       41%
    20       5%       10%       14%       19%       23%
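
The table comes from a one-line probability: the chance that at least one of n dice meets or exceeds the target. A quick sketch to reproduce it:

```python
# P(at least one of n d20s >= target) = 1 - ((target - 1) / 20) ** n

def hit_chance(target, rolls=1):
    return 1 - ((target - 1) / 20) ** rolls

# Spot-check against the table:
print(round(hit_chance(11, 1) * 100))  # 50
print(round(hit_chance(11, 2) * 100))  # 75
print(round(hit_chance(20, 5) * 100))  # 23
```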

I’ll include the 3d20 to 5d20 columns in the subsequent analysis, but for the most part I am really only thinking about the variation that occurs from 1d20 to 2d20.

We can really see two things going on as we progress from one to two dice. The first is that the bonus is a bell curve with +5% at a target of 2 and 20, but a bonus of +25% in the middle. If we were to examine the incremental bonus (as opposed to percentages), it would look like this:


What we see is that a second die roll is basically identical to training (+5) for the most common DCs in the 8 to 14 range. For most of the game, rolling 2d20 would play identically. Where it differs, though, is at the extremes. Under the current rules, an easy task might be automatic for a trained character and present some modest challenge for an untrained character. For instance, if an untrained character “has to roll a 6,” he has a 75% chance to succeed. The trained character would then “have to roll a 1” and auto-succeeds. I agree that the trained character should be better, but some amount of tension is always preferred. Under this system, the trained character “has to roll a 6 on one of two dice” and has a 94% chance to succeed. I prefer that small chance to fail that makes the check still matter.

The second thing we see in the jump from one to two dice is the impact of training. That is, if the first thing we discussed is “how likely are you to succeed?”, the second thing is “how much more likely is a trained character to succeed than an untrained character?” That question is answered by dividing the gain by the original likelihood of success. It looks like this:


So against a painfully easy task (2+), a trained character isn’t that much more likely to succeed than an untrained character. It is painfully easy for both of them. Against an average difficulty task (the 8-14 range), a trained character is about 50% more likely to succeed. But against something really hard (20), a trained character is almost twice as likely to succeed.
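
Those ratios can be reproduced directly, using the same formula as the table:

```python
# Relative improvement of a trained (2d20) over an untrained (1d20)
# character: (p2 - p1) / p1.

def hit_chance(target, rolls=1):
    return 1 - ((target - 1) / 20) ** rolls

def relative_gain(target):
    p1, p2 = hit_chance(target, 1), hit_chance(target, 2)
    return (p2 - p1) / p1

print(round(relative_gain(2), 2))   # 0.05 -- painfully easy: tiny edge
print(round(relative_gain(11), 2))  # 0.5  -- average: ~50% more likely
print(round(relative_gain(20), 2))  # 0.95 -- very hard: almost twice as likely
```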

I haven’t finished tinkering with it, but I think there might be room to develop this to streamline DCs so that the only variable is level instead of level and difficulty. As always, input that helps push this along is appreciated.

Tuesday, June 21, 2011

A second look at the calendar

One of my earliest posts on the site was about how I developed a “system” to make my campaign world’s calendar more useful. The system made the months and their progression more intuitive so that players could learn and employ it more easily. To date it has been a success. This post expands a little upon that system to explain the rest of how the calendar shows up in play. As today marks the summer solstice, a calendar post felt appropriate.

The world has two moons: a small black moon and a large silver moon. This is undoubtedly a holdover from my early love of Krynn, but I’ve always disputed that. The unique thing is that the black moon routinely passes in front of the silver moon to create a halo in the night sky. These haloes mark every equinox and solstice.

I wanted the silver moon to be “the moon” and so its cycle is what the months are predicated on. The black moon, then, is the bonus moon that makes interesting things happen from time to time. As a month is 28 days, the silver moon’s cycle is also 28 days, coming to full each month on the 14th. The black moon has a 42 day cycle (1.5x the silver moon) allowing them to sync up every three months.
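
The sync-up falls straight out of the cycle lengths: the moons line up every least common multiple of their periods.

```python
# Two cycles of 28 and 42 days line up every lcm(28, 42) = 84 days,
# i.e. every three 28-day months.
import math

silver, black = 28, 42
sync = math.lcm(silver, black)
print(sync, sync // silver)  # 84 3
```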



Once it was set, it was easy enough to produce a graphic. You can see by the little symbols when each moon reaches its full and when the haloes occur; each equinox and solstice were then named. I quickly realized, though, that the graphic was (a) a little empty and (b) an interesting place to communicate more celestial information. I decided that much like Earth’s Venus, a distant planet would make a pentagram in a five year cycle in the night sky. I also decided that a second planet, Faero, would move across the sky in a small arc above the horizon during the winter to spring.

In the past, I had produced a range of less successful calendars. Some had dozens of holidays to mirror Earth’s calendar, but it turns out that Earth-workers and families need more holidays than adventurers do. I had calendars that used different conventions for different races, but I found that players of elven characters rarely put in the extra effort to distinguish their understanding from the “common” understanding. In short, the more complex the tool got, the less it was used. This calendar is as simple as I can make it without making it feel trite, and, graciously, players seem to actually pay it credence. That isn’t to say it does a lot (but, honestly, I don’t *want* my calendar to do much), but it does enough. Players wonder if something crazy is going to happen at the halo. They remember the four holidays and look forward to festivals in whatever town they visit. When I drop that Faero is “at its peak,” players that care can recall that it makes a simple arc.

It isn’t much, but it adds enough.

Monday, June 20, 2011

Armor as DR--what it means for the game

Some quick recap.

In building off the introductory post, we see that the effectiveness of the DR depends on the armor damage (increasing the height of the red box), amount of DR (increasing the height of the black bar), monster hit rate (increasing the length of the red box and black bar), and how much defense is lost (increasing the length of the pink box).



Whether the armor provides a net gain or a net loss is, again, determined by comparing the areas of the black bar to the pink box. The black bar is the damage that would have been taken and was now absorbed by the armor. The pink box is the damage that is now taken, but would have missed but for the reduced defense.

If we calculate DR by average damage, this means that it might be more rewarding (if the monster rolls low on damage) or less rewarding (if the monster rolls high). That variability makes it an interesting tradeoff across all battles. In systems where DR is accrued without a penalty, it isn’t interesting. Gaining 2 DR is always better than zero, and 4 DR is better than 2. Without penalties, you always go for the maximum DR.

Of course, one alternative is to have armor be a penalty in other ways. You can make it expensive to acquire (in feats or in gold), have high armor checks, or reduce speed. These methods are less desirable because they make it hard for low level characters to achieve the archetype of the heavily armored character. We don’t want rule mechanics to get in the way of the game.

Making armor cost a lot to acquire is also worse because once you’ve paid the cost, you are always richly rewarded. It would be like a string of feats that each do nothing, but are prerequisites for a feat that gives +10 attack. The path to the ultimate feat is boring, and once you are there it is similarly boring, just in a different way. We want choices to provide interesting variability as often as possible. Making armor occasionally a bad call is a good (design) call.

But we cannot ignore that different tiers of armor require different investment. How do we distinguish them? Initially, Light armor is set to reduce damage taken from each hit by an average of 1% of total HP at any given level. Medium armor is 2% and Heavy armor is 3%. These percentages are maintained by allowing for magical bonuses to increase the DR granted. At the same time, different armors will have different magical properties available to them with the more powerful properties being reserved for heavier armors.
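
As a sketch, those percentages translate into concrete DR once total HP is known. The 100 HP figure and the rounding rule below are illustrative assumptions, not rules text.

```python
# Light/Medium/Heavy DR as 1%/2%/3% of total HP per hit (assumed
# rounded to the nearest point, minimum 1).

def armor_dr(total_hp, tier):
    rate = {"light": 0.01, "medium": 0.02, "heavy": 0.03}[tier]
    return max(1, round(total_hp * rate))

# For a hypothetical 100 HP character:
print(armor_dr(100, "light"), armor_dr(100, "medium"), armor_dr(100, "heavy"))  # 1 2 3
```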

The defense lost is static and does not go away. Conceivably some feat or class could take some of it away, but I wouldn’t recommend it because then the interesting tradeoff also goes away.

Finally, I briefly mentioned the idea of cutting off armor during battle. These are the neat scenes that make the game memorable; when a knight realizes his foe hits so infrequently but so hard that he’d be better off removing his armor and just dodging. That has awesome potential, but the current rules really don’t support stuff like that. Armor currently takes long enough to don that it isn’t tenable as a “during combat” action. There is no point in writing up rules for stuff that, because of the rules you chose, will never see use. This is Type II Clutter.

Monsters and wounds

Not too long ago I discussed all things hit points which included a subsection about wounds. To recap, wounds occur in two scenarios: first, when you fail a saving throw while dying and, second, when you take damage from a single source greater than your damage threshold. The damage threshold is set so that only really big hits cross it. However, the threshold falls when you are bloodied allowing a wider range of hits to deal wounds. For characters, wounds cost a surge (although you can immediately spend the surge if you have certain powers). For the most part, the identical rules apply to monsters. Today we’ll have a look at what that means for combat.

Monsters have fewer surges and wounds (recall that surges basically equal wounds since a wound counts against max surges) than PCs. This is fair because monsters tend to show up in only a single combat while PCs are expected to keep trudging along. Many monsters, like many PCs, have abilities that trigger when they suffer a wound or upon first becoming bloodied. These abilities allow them to spend the surge instead of it being wasted. For some monsters the ability provides healing and for others it will trigger an attack. Most monsters have two surges but some will have 1 or 3. Elites and solos gain additional surges and also have the ability for their triggered abilities to recharge.

Let’s look at a basic monster with two surges. At first, combat plays as normal. The damage threshold is high enough that it won’t likely be triggered unless a high-damage character scores a hit and rolls high. If that happens, a wound is dealt and the monster can either lose the surge or spend it. Most likely, the monster will spend it, triggering either healing (which it now needs) or delivering a big attack (which is exciting).  The monster now has one surge remaining.

Assuming the monster isn’t bloodied, it probably fights on. It is unlikely that another wound will be dealt before it is bloodied and it has a lot left in it. Eventually, though, the monster’s HP are whittled down and it becomes bloodied, reducing its damage threshold. Now the monster is nervous. It has only one surge, no triggered abilities remaining, and a decent hit might deal a wound. It will consider fleeing unless there is reason to stay (like a boss that will kill it later if it flees). If another wound is delivered, the monster now has 0 surges remaining. Characters with no surges take a -10 penalty to their damage threshold because they are no longer able to fully defend themselves; any hit becomes a big hit. At this point, any solid strike is going to deliver a wound and send the monster into death throes. He flees.
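
The walkthrough above can be sketched as a tiny state machine. The threshold numbers here are illustrative stand-ins, not the actual rules math.

```python
# Sketch of monster wounds: a hit at or above the damage threshold
# deals a wound and costs a surge; being bloodied or out of surges
# lowers the threshold. Threshold values are made up for illustration.

class Monster:
    def __init__(self, hp, surges=2, base_threshold=15):
        self.hp = self.max_hp = hp
        self.surges = surges
        self.base_threshold = base_threshold

    @property
    def bloodied(self):
        return self.hp <= self.max_hp // 2

    def damage_threshold(self):
        t = self.base_threshold
        if self.bloodied:
            t -= 5    # bloodied: a wider range of hits can wound
        if self.surges == 0:
            t -= 10   # no surges: any solid hit becomes a big hit
        return t

    def take_damage(self, amount):
        self.hp -= amount
        wounded = amount >= self.damage_threshold()
        if wounded and self.surges > 0:
            self.surges -= 1  # spend or lose a surge on the wound
        return wounded

m = Monster(hp=40)
print(m.take_damage(18), m.surges)  # True 1 -- a big hit wounds and costs a surge
print(m.take_damage(10), m.surges)  # True 0 -- bloodied now, so a modest hit wounds too
```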

This has three neat impacts on the game. First, it plays close to core 4e, but there is potential to end the battle quicker with well timed big hits. Second, it builds in a mini-morale system that makes sense and other rules (like intimidate) can piggy back on. Third, it facilitates those cinematic moments where the monster is fleeing and the archer lets loose an arrow at 100 paces that kills the foe *without* requiring the foe to be in that narrow window of HP where it works. Now, it works most times that it makes sense for the monster to be fleeing.

You can extrapolate from the mini example above to see how it would change if the monster had 3 surges or 1; it just makes them more or less risk averse for the rest of the battle, but not until the PCs have either demonstrated their ability to deal big damage or whittled down the monster’s HP. You can also make minions just by producing monsters with 0 surges: one big hit kills them, but they continue to fight against many little hits. Finally, elites and solos, because they get more surges and the ability to recharge, make for really exciting combats as you keep pushing them into bloodied (thereby reducing their damage threshold) but then race against the clock to deliver wounds before their abilities refresh and they heal above bloodied.

All in all, it is a fairly simple and intuitive change that has a really interesting impact on combat, pushing the tension earlier in the combat, resolving combats quicker, and naturally introducing a morale system that follows simple logic instead of rules that have to be memorized.

Sunday, June 19, 2011

Armor as DR

Armor as DR presents an interesting opportunity that never really worked laid over the top of the game but can find its place if designed in from the outset. There were two main failings with armor as DR that I have encountered in the past. First, at low levels, armor tends to provide too much DR, making many enemy attacks almost irrelevant. This was largely a problem because low level PCs had so few hit points (one hit die), requiring low level monsters to deal little damage, while DR came in Light, Medium, and Heavy “tiers” of armor. Second, armor becomes too necessary. When magic armor is granting 10 DR, all characters are overly incentivized to get as heavy of armor as possible. This isn’t fun when it shuts out legitimate character archetypes.

My working solution is to install a tradeoff. Armor grants DR but also lowers Reflex (the AC equivalent). This way, as monster damage increases, the reduction in defense becomes more detrimental at almost the same rate the DR increases.

When DR is acquired without penalty, it is always a no-brainer. Acquire as much as possible. This is because the effectiveness of straight DR is equal to [amount of DR] x [average monster hit rate]. If you have 10 DR and monsters hit 50% of the time, every time you are attacked you can expect an average saving of 5 damage (assuming monster damage is above 10).

The formula when you introduce a tradeoff is somewhat more complex. The armor saves more than it costs when the damage dealt < [ ( [DR x old hit rate] / [new hit rate – old hit rate] ) + DR ]. Here it is graphically.


The top bar shows a normal character. The second bar shows the same character with armor that reduces defense by 2 for 1 DR. The area in which the character is damaged extends to cover both the red and the pink, but the entire band is reduced by the black and grey, so at first glance the expected damage looks like a wash. What we see, though, is that the real tradeoff lies between the pink and the black. If the area of the black is larger than the area of the pink, the DR provided a net gain. DR, then, is most valuable when the enemies are likely to hit, because a longer initial red zone means any DR will automatically provide a longer black bar. Here is a quick example.



If we set Light, Medium, and Heavy armor to -1, -2, and -3 defense, respectively, we can eke out a modest DR progression that makes each of them attractive across levels. This requires an understanding of expected monster hit rates (which I plan to have relatively consistent) and expected monster damage rates (which are also established). Since each level of armor represents a larger investment, I plan to make the payouts for heavier armors similarly better.

The nifty thing is that even though over the course of the campaign armor will average out to be better, it might vary battle to battle. If you face a foe that hits rarely but hits hard, you’d be better off without armor and rely on your ability to dodge. The reverse is also true. This creates opportunities for cool scenes where the knight removes his armor before riding into battle, or, better yet, cuts it off during battle.
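
That battle-to-battle tradeoff can be computed directly. Without armor, expected damage per attack is old_hit x damage; with armor it is new_hit x (damage - DR); setting them equal gives a breakeven of DR x new_hit / (new_hit - old_hit). A quick check, expressing hit rates as hitting faces on a d20 to keep the arithmetic exact:

```python
# Breakeven damage below which armor is a net gain.  Hit rates are
# given as the number of d20 faces that hit (out of 20).

def breakeven_damage(dr, old_faces, new_faces):
    return dr * new_faces / (new_faces - old_faces)

# 1 DR for -2 defense: a monster needing 11+ now hits on 9+.
print(breakeven_damage(1, 10, 12))  # 6.0
# Check: 10/20 * 6 == 12/20 * (6 - 1) == 3.0 expected damage either way.
```

So against this foe, armor pays off exactly when its average damage is under 6: a hard-hitting but rarely-hitting enemy is the one you strip your armor for.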