State of Survival is a large-scale strategy game with a very big map (called the Wilderness) full of online players' settlements.
What I want to replicate for our game are large city maps (think 2000 × 2000 tiles). Inside each city map, online players will have their own headquarters building placed. Besides these player buildings, the city maps will also contain other types of objects, such as resource tiles (food, lumber, etc.), which any player can consume; once a player consumes a tile, it's gone and no one else sees it.
I've estimated the total number of objects present at any given time to be around 15K per city map.
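With ~15K objects per city, clients probably shouldn't fetch the whole map at once. One common approach is to bucket object positions into coarse grid cells so a player only loads the cells around their viewport. This is just a sketch of that idea; `CELL_SIZE` and the key format are my own assumptions, not anything from PlayFab or the game:

```javascript
// Sketch: bucket each object's (x, y) into a coarse grid cell so clients
// only fetch objects near their viewport instead of all ~15K per city.
// CELL_SIZE and the key format are assumptions for illustration.
const MAP_SIZE = 2000;   // tiles per side, as described above
const CELL_SIZE = 125;   // 2000 / 125 = 16, so 16 x 16 = 256 cells per city

function cellKey(cityId, x, y) {
  const cx = Math.floor(x / CELL_SIZE);
  const cy = Math.floor(y / CELL_SIZE);
  return `city#${cityId}|cell#${cx}:${cy}`;
}

// A player looking at tile (x, y) only needs the cells around them:
function visibleCells(cityId, x, y) {
  const keys = [];
  for (let dx = -1; dx <= 1; dx++) {
    for (let dy = -1; dy <= 1; dy++) {
      const nx = x + dx * CELL_SIZE;
      const ny = y + dy * CELL_SIZE;
      if (nx >= 0 && nx < MAP_SIZE && ny >= 0 && ny < MAP_SIZE) {
        keys.push(cellKey(cityId, nx, ny));
      }
    }
  }
  return [...new Set(keys)]; // de-duplicate at map edges
}
```

With 256 cells and ~15K objects, each cell holds roughly 60 objects, which keeps any single read small.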
Now, the game has a world map made up of hundreds of such city maps (think continents on Earth's world map).
Once a city map fills up with enough player buildings (3,000 or so), all newly joined players will start spawning in another, emptier city map. For this, I'm thinking I'll keep a simple title data entry that tracks the current city number where a newly joined player will first spawn.
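The routing rule itself is simple enough to express as a pure function: stay in the current spawn city until it hits capacity, then advance to the next city with room. A minimal sketch, where the counter source (e.g. title internal data) and the names are my assumptions:

```javascript
// Sketch of the spawn-routing idea described above. buildingCounts maps
// city number -> current player-building count; in practice that counter
// could live in PlayFab title internal data. Names are illustrative.
const CITY_CAPACITY = 3000; // player buildings before a city is "full"

function pickSpawnCity(buildingCounts, currentCity) {
  // Keep spawning players in the current city until it reaches capacity...
  if ((buildingCounts[currentCity] || 0) < CITY_CAPACITY) {
    return currentCity;
  }
  // ...then advance to the next city that still has room.
  let city = currentCity + 1;
  while ((buildingCounts[city] || 0) >= CITY_CAPACITY) {
    city++;
  }
  return city;
}
```

A CloudScript handler would read the counter, call this, and write the new current-city number back if it changed.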
Because this game is going to be ultra large scale, with millions of players accessing common maps and objects over time (with players having attack and defense abilities, of course), I assume title data/CDN is not a good place for, say, a JSON file full of building coordinates. Even a CloudScript that just reads and updates that JSON would easily hit API call limits, since at any given point hundreds or thousands of players would be reading from and writing to the same JSON, causing contention.
So far I've set up player resources with CloudScript and internal player data for security, and that works well, but moving ahead with the map implementation seems to be a challenge.
Should I look at AWS DynamoDB for storing such large datasets that can be read and written by millions of players at the same time? Can I use DynamoDB with PlayFab CloudScript to keep all object locations in sync between all these players easily?
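One DynamoDB feature that maps well onto the "once a player consumes a tile, it's gone" rule is conditional writes: deleting the tile item only if it still exists guarantees exactly one harvester wins, even under heavy contention. Below is a sketch that only builds the `DeleteItem` request parameters; the table name and key attributes are my assumptions, and the actual call would be made from CloudScript via a signed HTTPS request to DynamoDB:

```javascript
// Sketch: consuming a resource tile must be atomic so two players can't
// both harvest it. DynamoDB conditional writes handle this: delete the
// tile item only if it still exists. "MapObjects", "pk", and "sk" are
// hypothetical names, not an existing schema.
function buildConsumeTileRequest(cityId, x, y) {
  return {
    TableName: "MapObjects",              // hypothetical table
    Key: {
      pk: { S: `city#${cityId}` },        // partition key: one city map
      sk: { S: `tile#${x}:${y}` },        // sort key: tile position
    },
    // DynamoDB rejects the delete with ConditionalCheckFailedException
    // if another player already consumed the tile, so at most one
    // harvest ever succeeds.
    ConditionExpression: "attribute_exists(pk)",
  };
}
```

Note that keying every object in a city under a single partition (`city#N`) can hot-spot one partition at your scale; splitting the partition key by grid cell within the city is a common way around that.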