I worked on @GeoSearch_dev all day to speed up the matching engine. I managed to reduce response times by 20%, which is IMHO amazing. Then I found out that when requests go through the web server, the response time is the same. #toughlife #buildinpublic
This feature took me more time than I expected, but the result is worth it. @GeoSearch_dev matching results should be much more accurate. Maybe it will interest some users. At least, it made me happy to work on it, as it was quite a challenge. #buildinpublic
A new matching feature has been added and successfully deployed to production. This should significantly improve the @GeoSearch_dev search results for existing geo objects and their aliases and idioms.
Writing a custom matching engine is quite difficult, especially if you're the only one working on it. I'm not surprised that MANGA employs thousands of programmers. I have a million ideas for @GeoSearch_dev but only one life. #buildinpublic
The @GeoSearch_dev API is free for everyone, without limits or tracking. You can transform any unstructured text data into meaningful geocoding data. The API is available at api.geosearch.dev/docs#/ so why not try it now? Any feedback is welcome. #buildinpublic
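For anyone curious what a call might look like: a minimal Python sketch using only the standard library. The endpoint path and payload shape below are my assumptions, not the documented API; check api.geosearch.dev/docs#/ for the real contract.

```python
import json
import urllib.request

def build_geosearch_request(text: str) -> urllib.request.Request:
    """Build a request against a hypothetical /search endpoint.

    The path and JSON body here are assumptions for illustration;
    the real routes are listed at api.geosearch.dev/docs#/.
    """
    payload = json.dumps({"text": text}).encode("utf-8")
    return urllib.request.Request(
        "https://api.geosearch.dev/search",  # assumed path
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# To actually send it (network call):
# with urllib.request.urlopen(build_geosearch_request("I moved from Prague to Berlin")) as resp:
#     print(json.load(resp))
```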
I quite enjoy working on these @GeoSearch_dev features. The best part is that several technical domains come together: GeoSpatial, GeoCoding, search engines and algorithms. And the fact that I learn something new every day. #buildinpublic
Today again a lot of fun with @FastAPI. I managed to add a couple of new API endpoints for @GeoSearch_dev. Users can easily query Countries and States. Cities will be added soon. The matching algorithm has been slightly improved for better results. #buildinpublic
Finally! Google Colab / Jupyter notebooks are ready to show how to work with @GeoSearch_dev and how easy it is to analyze your Twitter followers. I've also added a few improvements based on feedback. I hope you will like it. More notebooks will come in the near future. #buildinpublic
I'm preparing some Colab Jupyter notebooks as examples of what you can do with the @GeoSearch_dev service. In this case, an analysis of where your Twitter followers come from. Great fun with Pandas, Matplotlib and Plotly. Shareable and easy to use. #buildinpublic
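The core of that notebook analysis boils down to a few lines of Pandas. The follower data below is made up for the sketch; in the real notebooks the countries would come from geocoding follower profiles with @GeoSearch_dev, and Matplotlib/Plotly charts go on top.

```python
import pandas as pd

# Hypothetical result of geocoding follower locations:
followers = pd.DataFrame(
    {
        "handle": ["@a", "@b", "@c", "@d", "@e"],
        "country": ["Germany", "USA", "Germany", "Czech Republic", "USA"],
    }
)

# Count followers per country, ready to feed into a bar chart.
by_country = followers["country"].value_counts()
print(by_country)
```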
The first version of the @GeoSearch_dev API has been successfully deployed and is ready for the first private beta testers. Tomorrow I will send the first batch of invitations to people on the waiting list. The public beta will hopefully follow in 2 weeks. #buildinpublic
I spent half a day with Django REST Framework and various alternatives, and I finally gave up and went with @FastAPI, which I like for its simplicity and functionality. Then I built the @GeoSearch_dev API in 15 minutes. The public beta test is approaching. #buildinpublic
I knew it wouldn't be easy with timezones, but I had no idea how hard it actually can be. But I managed to get it into @GeoSearch_dev and process timezone data for most places in the world. I also added information about the Region and Sub-region. I'm excited. #buildinpublic
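Timezone rules really are hairy (DST, historical changes), which is why leaning on the IANA database helps. A small sketch using Python's standard-library zoneinfo; the place-to-zone mapping is made up and stands in for the kind of lookup a geocoding pipeline resolves.

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# Hypothetical mapping from a place name to its IANA timezone.
PLACE_TZ = {"Prague": "Europe/Prague", "Tokyo": "Asia/Tokyo"}

def utc_offset_hours(place: str, when: datetime) -> float:
    """UTC offset of `place` at `when`, DST-aware via zoneinfo."""
    tz = ZoneInfo(PLACE_TZ[place])
    return when.replace(tzinfo=tz).utcoffset().total_seconds() / 3600

# January: Prague is on CET (UTC+1); in July it's CEST (UTC+2).
print(utc_offset_hours("Prague", datetime(2022, 1, 15, 12, 0)))  # → 1.0
print(utc_offset_hours("Prague", datetime(2022, 7, 15, 12, 0)))  # → 2.0
```

The same place having two different offsets depending on the date is exactly what makes naive "city → offset" tables wrong.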
Building a custom search engine from scratch is not easy. Actually, it's pretty complicated in general. It's even more complicated when it's a specialized engine like the one for @GeoSearch_dev, where the text you search contains countries, states or cities. #buildinpublic
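To give a feel for why it's hard: even a naive toy version of geo matching in free text already has to handle aliases and multi-word names. A sketch with a made-up alias table; the real engine obviously does far more (ranking, disambiguation, idioms).

```python
# Toy sketch: map aliases to canonical names, trying longer
# aliases first so "new york" wins before shorter substrings.
ALIASES = {
    "czechia": "Czech Republic",
    "czech republic": "Czech Republic",
    "nyc": "New York",
    "new york": "New York",
}

def match_places(text: str) -> list[str]:
    found = []
    lowered = text.lower()
    for alias in sorted(ALIASES, key=len, reverse=True):
        if alias in lowered:
            found.append(ALIASES[alias])
            lowered = lowered.replace(alias, " ")  # avoid double counting
    # de-duplicate while preserving match order
    return list(dict.fromkeys(found))

print(match_places("Flying from NYC to Czechia next week"))
# → ['Czech Republic', 'New York']
```

Even this toy version shows the pitfalls: substring matching produces false positives, and real names need tokenization, normalization and scoring rather than `in` checks.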