5 comments

  • mike_hearn 2 hours ago
    I don't quite understand how Modolap differs from just asking AI to use any other OLAP engine? Both your website and the github readme just emphasise that it's idiosyncratic and your personal approach, without explaining what that is or why anyone should care.
    • ronfriedhaber 1 hour ago
Appreciate the feedback. I'll revamp the README; it's gone stale.

      > "how Modolap differs from just asking AI to use any other OLAP engine"

There are two components right now: the OLAP query engine and the remote infrastructure service. The service lets systems like Codex (or developers) manage datasets, keep queries under version control, and offload compute to dedicated machines. That matters more given the current trend of running agents inside micro-VMs.

It's also designed with AI usage in mind, and I think there's real value in that co-design. You could argue that models can use Polars or DuckDB just as well and that there's no room for improvement, but I don't think that's true.

      • bastawhiz 50 minutes ago
        What room for improvement is there?
      • esafak 23 minutes ago
        I don't get the value proposition either; your landing page is underdeveloped. Tracking the query history is trivial. Offloading computation could be done with Polars Cloud or MotherDuck. Can you expand on the "manage datasets" part?
  • zeroxfe 48 minutes ago
    I've done this kind of thing many times with codex and sqlite, and it works very well. It's one prompt that looks something like this:

    - inspect and understand the downloaded data in directory /path/..., then come up with an sqlite data model for doing detailed analytics and ingest everything into an sqlite db in data.sqlite, and document the model in model.md.

Then you can query the database ad hoc pretty easily with codex prompts (and also generate PDF graphs as needed).

    I typically use the highest reasoning level for the initial prompt, and as I get deeper into the data, continuously improve on the model, indexes, etc., and just have codex handle any data migration.
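
    For concreteness, the ingest-and-query step that prompt produces tends to look something like this. A minimal sketch only: the table name, columns, and sample rows are invented stand-ins (the real schema comes from inspecting the downloaded data, and the db would be data.sqlite on disk rather than in memory):

    ```python
    import sqlite3

    # In practice: sqlite3.connect("data.sqlite")
    conn = sqlite3.connect(":memory:")

    # Schema and index that codex would derive from the actual data
    conn.execute("CREATE TABLE events (ts TEXT, user TEXT, score INTEGER)")
    conn.execute("CREATE INDEX idx_events_user ON events (user)")

    # Ingest step (rows here are made-up samples)
    rows = [("2024-01-01", "alice", 10), ("2024-01-02", "bob", 7)]
    conn.executemany("INSERT INTO events VALUES (?, ?, ?)", rows)
    conn.commit()

    # The kind of ad hoc analytics query codex then generates on demand
    top = conn.execute(
        "SELECT user, SUM(score) AS total FROM events "
        "GROUP BY user ORDER BY total DESC"
    ).fetchall()
    print(top)
    ```

    The model.md file then just documents the schema and indexes so later prompts don't have to re-inspect the raw data.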

  • throwaway290 59 minutes ago
Is HN data open? Under what conditions is it distributed?
    • bastawhiz 49 minutes ago
      There's an API link at the bottom of every page.