Idraluna Archives

The Great Antarctic Hexcrawl pt. 9 - Cartography, Naming Stuff, Diversifying Regions, Markdown

This is my own version of lore24, an admittedly over-ambitious attempt to procedurally generate a 43,000-hex crawl for my homebrewed far-future Antarctica, Antibor. Part 1, Part 2, Part 3, Part 4 and 4b, Part 5, Part 6 and 6b, Part 7, 7b, 7c, Part 8.


I haven't had a new GAH post in a while, mostly because after completing a trial run of the generator scripts there's not much to add except for the slow & relatively boring work of writing new hexfills and expanding & improving existing generators. That said, I've made a few bigger changes, detailed below.

Improving the Regional Maps

I had written R code to generate prefecture maps for insertion into the hex keys. I'm pretty proud of these, but solo playtesting surfaced a few problems, and I've since updated the generator to fix them.

Naming Stuff

I've been reluctant to run the generators on more than a couple prefectures at a time as I want to leave lots of room for future creativity. But I've been roving around naming cities, some dungeons, and interesting landforms. A GMing lesson I learned early on is that being able to casually reference place names (whether or not those places exist beyond a name) goes a long way to making a world feel real. It's not a substitute for writing actual hexes, but it's another way of sprinkling around prompts and cues for later.

Simplifying the Hex Attributes

The amount of information saved for each hex had bloated as I experimented with different ways of distilling hexes from my original data. To prepare for the major changes described below, I pared my main .gpkg file down to a handful of core attributes: hex ID (rowcol), Name, Biome, Elevation, Terrain, Region, Subregion, Prefecture, Hexfill, and Content.

The most significant element left on the cutting room floor is the 'Hidden hexfill' attribute, inspired by the Landmark, Hidden, Secret blog post. It's a cool idea (& I'm a big fan of Anne's blog), but treating 'hidden' as a formal category led to some ugly code contortions and during solo playtesting I found it burdensome to track. Antibor is huge, and having multiple points of interest per hex undercuts the core appeal of a continent-scale hexcrawl. Of course, none of this precludes adding extra distinct features to a hex description or specifying that a regular hexfill entry is hidden in some way; it just saves me having to track two potential types of entry for each hex.

Obsidian Markdown

The problem of syncing between a (huge) GIS file and a key document in LaTeX described in pt. 7 has persisted. QGIS makes it easy to pan around the map, quickly visualize any hex attribute, and edit short fields like name, biome, region, etc. But it's really bad for writing & editing prose. LaTeX is fine for writing, but navigating to a specific hex to do that writing is a huge hassle.

I use Obsidian to write my blog posts and organize most of my personal notes and writing. It has nifty features like hyperlinking to other notes & tracking backlinks, the ability to attach metadata and tags to a note, and the capacity to auto-update references when a file name changes. I have it set up to sync to my phone, so I can edit notes while on errands or a walk, or while watching tv with my wife. Most importantly, it's a clean, no-frills text editor, imo the least annoying writing experience I've ever had.

So I thought, what if I store each hex as an Obsidian note? Hexes would be a lot easier to find & reference, easier to rename, and easier to link to notes with info on regions, cultures, factions, etc. Markdown is also a lot simpler than LaTeX, so writing a script to sync material written in Obsidian to the GIS file should be easier than an equivalent script for the LaTeX file I was using.

I wrote an experimental script to generate a .md file for each hex. The script stores most hex information as yaml front matter, which appears in Obsidian like so:
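Reconstructed from the generation script below, a hex note's header looks roughly like this (the place names and values here are purely illustrative):

```yaml
---
tags:
  - dnd/antibor/hex
aliases:
  - "0412"
  - Vashtern Tarn
Biome: Tundra
Elevation: 820
Terrain: Hills
Region: "[[The Grey Reach]]"
Subregion: "[[Vashtern Lakes]]"
Prefecture: "[[12]]"
Hexfill: Ruin
Neighbors:
  - "[[0411]]"
  - "[[0413-E]]"
---
```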

The advantage is that the yaml format makes it easy to read the data back in when syncing to QGIS.
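For instance, the yaml package turns a header string straight into a named R list, so each field is directly addressable:

```r
library(yaml)

# parse a two-field header; yaml.load returns a named list
meta <- yaml.load('Biome: Tundra\nElevation: 820')
meta$Biome      # "Tundra"
meta$Elevation  # 820
```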

The (now simplified) hex contents are stored as the body of the markdown file. The beauty here is that I can adapt my prefecture generation script to append its output to the markdown file without having to overwrite anything.
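Appending is just `cat()` with `append = TRUE` after the yaml header; a minimal sketch, assuming a hypothetical `generate_hexfill()` wrapper around the prefecture generator:

```r
# Append generated hexfill text below any hand-written content,
# leaving the yaml header untouched.
hexfill_text <- generate_hexfill(hex)  # hypothetical generator call
cat('\n', hexfill_text, '\n', file = mdfile, append = TRUE, sep = '')
```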

library(terra)      # vect(), nearby(), writeVector()
library(tidyterra)  # dplyr verbs on SpatVectors
library(dplyr)

hexes <- vect(file.path(mapdir, 'Full_hexes.gpkg')) %>% 
  mutate(md_name = ifelse(is.na(Name)|Name=="", rowcol, Name)) %>%
  # tag hexes with no content yet with an '-E' suffix
  mutate(md_name = ifelse(is.na(Content)|Content=="", paste0(md_name, '-E'), md_name))
names(hexes)
land_hexes <- hexes %>% filter(Biome!='Water')
land_hexes.df <- as.data.frame(land_hexes)
count = 0

for (rowcol_id in land_hexes$rowcol){
  if(is.na(rowcol_id)){next}

  count <- count + 1
  print(rowcol_id)
  cat(paste('-', count, 'of 43459\n'))  # crude progress indicator
  hex <- as.data.frame(land_hexes[land_hexes$rowcol==rowcol_id,])
  #print(hex)

  #if(is.na(hex$Hexfill)&is.na(hex$Hexfill_hidden)&is.na(hex$Name)){next}  # don't make a file if there's no hexfill, hidden hexfill, or name

  hex_name <- hex$md_name
  mdfile <- file.path(hexdir, paste0('Prefecture ', hex$Prefecture), paste0(hex_name, '.md'))
  if(file.exists(mdfile)){next}  # don't overwrite files!

  ### calculate neighbors
  neighbor_hex_ids <- nearby(land_hexes[land_hexes$rowcol==rowcol_id,], land_hexes, distance = 17600) %>% as.data.frame()
  neighbor_hexes <- land_hexes[neighbor_hex_ids$to_id,] %>% filter(rowcol != rowcol_id) %>% as.data.frame() %>% select(md_name)

  ## generate markdown file
  if(!dir.exists(file.path(hexdir, paste0('Prefecture ', hex$Prefecture)))){
    dir.create(file.path(hexdir, paste0('Prefecture ', hex$Prefecture)))
  }

  ### generate yaml
  cat('---\n', file=mdfile)
  cat('tags:\n  - dnd/antibor/hex\n', file=mdfile, append=T)
  cat('aliases:\n', file=mdfile, append=T)
  cat(paste0('  - ', rowcol_id, '\n'), file=mdfile, append=T)
  if(hex_name != rowcol_id){
    cat(paste0('  - ', hex_name, '\n'), file=mdfile, append=T)
  }
  cat(paste0('Biome: ', hex$Biome, '\n'), file=mdfile, append=T)
  cat(paste0('Elevation: ', hex$Elevation, '\n'), file=mdfile, append=T)
  cat(paste0('Terrain: ', hex$Terrain, '\n'), file=mdfile, append=T)
  if(!is.na(hex$Region)){
    cat(paste0('Region: \"[[', hex$Region, ']]\"\n'), file=mdfile, append=T)
  } else {
    cat(paste0('Region: \n'), file=mdfile, append=T)
  }
  if(!is.na(hex$Subregion)){
    cat(paste0('Subregion: \"[[', hex$Subregion, ']]\"\n'), file=mdfile, append=T)
  } else {
    cat(paste0('Subregion: \n'), file=mdfile, append=T)
  }
  cat(paste0('Prefecture: \"[[', hex$Prefecture, ']]\"\n'), file=mdfile, append=T)
  cat(paste0('Hexfill: ', hex$Hexfill, '\n'), file=mdfile, append=T)
  cat(paste0('Neighbors: \n  - \"[['), file=mdfile, append=T)
  cat(paste(neighbor_hexes$md_name, collapse=']]\"\n  - \"[['), file=mdfile, append=T); cat(']]\"\n', file=mdfile, append=T)
  cat('---\n', file=mdfile, append=T)

  ### generate body
  cat('# Contents\n', file=mdfile, append=T)

  if(!is.na(hex$Content)){  # the column is 'Content' (singular), matching the sync script
    cat(hex$Content, file=mdfile, append=T)
  }

  cat('\n\n', file=mdfile, append=T)
}

This loop is quite slow, but not too bad when run overnight.

To sync back from markdown to QGIS, I load each markdown file and use the yaml package to parse the metadata header. The result is an R list whose entries I can assign to the appropriate rows of the GIS dataframe. The code is pretty ugly and a real programmer would probably cringe at it, but so far it does what I want. (I'm keeping lots of backups just in case.)

################################################################################
#### Sync from markdown to geopackage ##########################################
################################################################################

library(yaml)  # yaml.load()

hexes <- vect(file.path(mapdir, 'Full_hexes.gpkg'))

# function to read in a md file
read_md_hex <- function(mdfile){
  lines <- readLines(con = mdfile)
  #print(lines)
  
  yaml_divs <- which(lines=='---')
  yaml_lines <- (yaml_divs[1]+1):(yaml_divs[2]-1)
  yaml_data <- lines[yaml_lines]
  #print(yaml_data)
  yaml_data_parsed <- yaml.load(paste(yaml_data, collapse = '\n'))
  
  hexfill_start <- which(lines=='# Contents')+1
  hexfill_end <- length(lines)
  
  filename <- tools::file_path_sans_ext(basename(mdfile))
  yaml_data_parsed$Name <- ifelse(filename == yaml_data_parsed$aliases[1], NA, filename)
  
  yaml_data_parsed$Hexfill <- paste(lines[hexfill_start:hexfill_end], collapse = '\n')
  
  if (yaml_data_parsed$Hexfill=='\n'){yaml_data_parsed$Hexfill <- NA}
  if (is.null(yaml_data_parsed$Subregion)){yaml_data_parsed$Subregion <- NA}
  
  df_row <- data.frame('rowcol'=yaml_data_parsed$aliases[1],
                       'Name'=yaml_data_parsed$Name,
                       'Biome' = gsub("^\\[\\[|\\]\\]$", "", yaml_data_parsed$Biome),
                       'Elevation' = yaml_data_parsed$Elevation,
                       'Terrain'=yaml_data_parsed$Terrain,
                       'Region'=gsub("^\\[\\[|\\]\\]$", "", yaml_data_parsed$Region),
                       'Subregion'=gsub("^\\[\\[|\\]\\]$", "", yaml_data_parsed$Subregion),
                       'Prefecture'=gsub("^\\[\\[|\\]\\]$", "", yaml_data_parsed$Prefecture),
                       'Content'=yaml_data_parsed$Hexfill)
  
  return(df_row)
}

# loop though md hex data
count = 0
md_hexes.df <- data.frame()
for(prefec_id in unique(hexes$prefecture_id_2)){
  directory <- list.files(file.path(hexdir, paste('Prefecture',prefec_id)), full.names = T)
  
  for (mdfile in directory){
    count <- count+1; cat(paste0(mdfile, '..', count, '\n'))
    md_hexes.df <- md_hexes.df %>% bind_rows(read_md_hex(mdfile) ) 
  }
}

### join result back to gpkg
hexes_updated <- hexes %>% filter(Biome!='Water') %>% select(rowcol, row_index, col_index, Elev_mean, Elev_stdev) %>%
  left_join(md_hexes.df)

hexes_updated <- rbind(hexes_updated, hexes %>% filter(Biome=='Water'))

nrow(hexes)
nrow(hexes_updated)

### Make a list of conflicts!
changed_hexes <- as.data.frame(hexes) %>% select(c('rowcol', 'Name', 'Biome_name', 'Elev_class', 'Rugged_class', 'Region_Name', 'Subregion', 'prefecture_id_2', 'desc', 'desc_h')) %>% 
  left_join(md_hexes.df, by='rowcol') %>%
  filter(
    # note: != yields NA (and the row is silently dropped) when either side is NA
    Name.x!=Name.y|Region.x!=Region.y|Subregion.x!=Subregion.y|Content.x!=Content.y|Prefecture.x!=Prefecture.y
  )

writeVector(hexes_updated, file.path(mapdir, 'Full_hexes.gpkg'), overwrite=T)

Pros & Cons

Writing in Obsidian is genuinely easier than QGIS or LaTeX. I prefer to work off of a pdf when running a game, but Antibor is too unfinished to try to compile in a print layout. Tracking backlinks and auto-updating links when a file's name changes is hugely helpful when renaming a hex or prefecture. And being able to pull up random hexes on my phone to work on while bored at work is really nice.

On the other hand,

In spite of all that, so far the payoff of being able to hand-edit hexes with ease has been worth it.

Graph view at like 10% loaded lol

Diversifying Regions

I added the 'Subregion' hex attribute alluded to above. I'm using it to name any feature larger than a hex but smaller than one of the big, vague cultural regions (e.g. islands, lakes, patches of forest, peninsulas).

Now that I can procgen all the basic types of hex feature, the task is to use those generators judiciously to avoid a continent full of homogeneous random slop. To that end, I'm planning to write themed, quasi-self-contained mini-hexcrawls in many (hopefully all) of these smaller regions. Since I'm as unlikely to run one of these as any other given part of Antibor, I'll stick to terse, Carcosa-esque entries unless I feel particularly inspired.

(Slowly, I'm developing a vision for Antibor as a patchwork of (normal-sized) hexcrawls within a connective tissue of randomly generated points of interest).

Vague theme ideas include:

  1. Specific forests, bays, lakes, islands, etc.

#DIY #GIS #antibor #lore24