Conversation
Too many files changed for review.
daphne-cornelisse left a comment
Thank you! I don't think it is a good idea to submit more than 1000 binaries to GitHub. Maybe we should put these on Hugging Face, or just add the JSON files with the script to process them? @eugenevinitsky, what do you think?
Context: these towns are too big, so Aditya cut them up into smaller parts.
Hmm, yeah, it depends on how big the files are in total. We don't want to store many GB on GitHub. But also, are they so big that 1000 is the right size? Also, what happens when you just load the whole map?

There are a lot of open questions here for me.

@Aditya-Gupta26 could you please comment on this? Maybe add a description to the PR with your reasoning process? Thank you :)
Question: why does the grid map reset happen / why is it expensive?

Yeah, my question is, what part of the entity cost has to be redone? Presumably just the cost of the dynamic entities, which is O(100).

Right, if we can loop through only the dynamic entities, it'll be much better. As of now we loop through all entities.
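
As a concrete illustration of the suggestion above, here is a minimal sketch of a reset that touches only the dynamic entities. The type and function names (`Env`, `grid_clear_dynamic`, `grid_insert`, `num_static_entities`) and the assumed entity layout are illustrative assumptions, not the actual codebase API:

```c
// Sketch only: assumes entities are stored with the static road geometry
// first and the O(100) dynamic entities (vehicles, pedestrians) after it.
typedef struct { float x, y; } Entity;   // placeholder fields
typedef struct { int num_cells; } Grid;  // placeholder spatial grid

typedef struct {
    Grid grid;
    Entity *entities;         // assumed layout: static entities first, then dynamic
    int num_entities;
    int num_static_entities;  // index where the dynamic entities begin
} Env;

void grid_clear_dynamic(Grid *grid);          // clears only dynamic-entity cells
void grid_insert(Grid *grid, Entity *entity); // inserts one entity into the grid

void reset_grid(Env *env) {
    // Static road geometry keeps its grid cells across resets.
    grid_clear_dynamic(&env->grid);
    // Re-insert only the dynamic entities instead of looping over everything.
    for (int i = env->num_static_entities; i < env->num_entities; i++) {
        grid_insert(&env->grid, &env->entities[i]);
    }
}
```

This keeps the per-reset cost proportional to the number of dynamic entities rather than the total entity count, which matters most for the large cut-up towns added in this PR.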


This PR adds new Carla towns (11, 12, 13) to our datasets. There is a slight window size modification in visualize.c to support rendering of large maps.
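
For reference, the kind of window adjustment described above could look roughly like the following; the window constants and map-extent parameters are assumptions for illustration, not the actual visualize.c change:

```c
// Sketch only: fit an arbitrarily large map into a fixed-size window by
// picking the smaller of the horizontal and vertical scale factors.
#define WINDOW_WIDTH  1280
#define WINDOW_HEIGHT 720

float compute_render_scale(float map_width, float map_height) {
    float sx = (float)WINDOW_WIDTH / map_width;
    float sy = (float)WINDOW_HEIGHT / map_height;
    return sx < sy ? sx : sy;  // use the tighter fit so the whole map stays visible
}
```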