Compare commits
4 Commits
master...feat/sales

| SHA1 |
|---|
| 9153494ed7 |
| ea7f46ea8a |
| 4597611655 |
| 26c9563a03 |
File diff suppressed because one or more lines are too long
@@ -0,0 +1,250 @@
---
title: Move Detailed Sales Data to DuckDB and Parquet
type: refactor
status: active
date: 2026-04-24
---

# Move Detailed Sales Data to DuckDB and Parquet

## Overview

Detailed sales records (orders, charges, line items, refunds) are currently stored in Datomic. Because Datomic is append-only, this high-volume data causes significant storage bloat. We will move these details to Parquet files stored on S3, using DuckDB as the query engine for views and summaries, while keeping the high-level `sales-summaries` in Datomic for ledger calculations.

---

## Problem Frame

The system stores every individual sale and payment detail in Datomic. While useful for auditing, this data is rarely accessed in detail after a few weeks, yet it permanently increases the Datomic database size. The app needs a "colder" but still queryable storage layer for these details.

---

## Requirements Trace

- R1. Detailed sales/payment entities must be moved from Datomic to Parquet files on S3.
- R2. `sales-summaries` must remain in Datomic to ensure ledger calculations remain performant and stable.
- R3. The "Sales Orders" and "Payments" views must continue to function (filtering, sorting, pagination) by querying the Parquet files via DuckDB.
- R4. The daily sales summary job must be updated to aggregate data from DuckDB instead of Datomic.
- R5. The system must handle "voids" of payments/orders in an immutable file format.

---

## Scope Boundaries

- **In Scope:**
  - Implementation of a Parquet writer for sales data.
  - DuckDB integration for reading S3 Parquet files.
  - Migration of existing detailed data from Datomic to S3.
  - Updating the summary aggregation job.
- **Out of Scope:**
  - Moving `sales-summaries` out of Datomic.
  - Implementing a real-time streaming pipeline (sticking to batch/daily flushes).

---

## Context & Research

### Relevant Code and Patterns

- **Production Flow:** `auto-ap.square.core3`, `auto-ap.ezcater.core`, and `auto-ap.routes.ezcater-xls` all produce tagged maps that are currently sent to `dc/transact`.
- **Read Flow:** `auto-ap.datomic.sales-orders` and `auto-ap.ssr.payments` perform the current Datomic queries.
- **Aggregation:** `auto-ap.jobs.sales-summaries` uses `dc/q` to sum totals for the day.

---

## Key Technical Decisions

- **Storage Format:** Parquet. It is columnar, highly compressed, and natively supported by DuckDB.
- **Storage Location:** AWS S3. This removes the need for a managed database server.
- **Query Engine:** DuckDB. It can query Parquet files directly on S3 without importing them into a local database.
- **Write Strategy:** Daily batch. To avoid the "small file problem" in S3/Parquet, data will be buffered (locally or in a staging table) and flushed as one file per day: `s3://bucket/sales-details/YYYY-MM-DD.parquet`.
- **Voiding Strategy:** Append-only log. A "void" is simply a new record with the same `external-id` and a `status: voided`. The read query will always select the record with the latest timestamp for a given ID.
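The latest-wins read behind the voiding strategy can be sketched as a DuckDB query over the daily files. This is a sketch only: the bucket path and the `external_id`/`timestamp` column names are illustrative assumptions, not the real schema.

```clojure
;; Sketch: latest-wins selection over the append-only log.
;; Path and column names are assumptions for illustration.
(def latest-wins-sql
  "SELECT *
   FROM read_parquet('s3://bucket/sales-details/*.parquet')
   QUALIFY row_number() OVER (PARTITION BY external_id
                              ORDER BY timestamp DESC) = 1")
```

With this shape, a void needs no mutation: once the `status: voided` row is the newest for its `external-id`, it wins automatically.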

---

## Implementation Units

- U1. **S3 Storage & DuckDB Infrastructure**

**Goal:** Set up the S3 bucket structure and the DuckDB connection utility.

**Requirements:** R1, R3

**Dependencies:** None

**Files:**

- Create: `src/clj/auto_ap/storage/parquet.clj` (DuckDB connection and S3 config)

**Approach:**

- Implement a `with-duckdb` wrapper that initializes DuckDB, loads the `httpfs` extension, and configures S3 credentials.

**Verification:**

- A test that can run a simple `SELECT 1` via DuckDB.
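A minimal sketch of the `with-duckdb` wrapper, assuming the DuckDB JDBC driver is on the classpath and that S3 credentials arrive as an `s3-config` map (a hypothetical shape); the `INSTALL httpfs` / `SET s3_*` statements follow DuckDB's documented httpfs setup:

```clojure
(ns auto-ap.storage.parquet
  (:require [clojure.java.jdbc :as jdbc]))

;; Sketch only: `s3-config` keys (:region, :access-key, :secret-key)
;; are assumptions, not the project's actual config shape.
(def db-spec {:connection-uri "jdbc:duckdb:"})

(defn with-duckdb
  "Run `f` with a DuckDB connection that has httpfs loaded and S3 configured."
  [s3-config f]
  (jdbc/with-db-connection [conn db-spec]
    (jdbc/execute! conn ["INSTALL httpfs; LOAD httpfs;"])
    (jdbc/execute! conn [(str "SET s3_region='" (:region s3-config) "';")])
    (jdbc/execute! conn [(str "SET s3_access_key_id='" (:access-key s3-config) "';")])
    (jdbc/execute! conn [(str "SET s3_secret_access_key='" (:secret-key s3-config) "';")])
    (f conn)))
```

The `SELECT 1` verification test would then call `(with-duckdb cfg #(jdbc/query {:connection %} ["SELECT 1"]))`.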

---

- U2. **Parquet Writer Implementation**

**Goal:** Create a service to convert sales maps into Parquet files and upload them to S3.

**Requirements:** R1

**Dependencies:** U1

**Files:**

- Modify: `src/clj/auto_ap/storage/parquet.clj`
- Test: `test/clj/auto_ap/storage/parquet_test.clj`

**Approach:**

- Implement a `flush-to-parquet` function that takes a collection of maps and uses a library to create the file.
- Implement the S3 upload logic.
- **Recovery:** Implement a "flush-log" in the local SQLite WAL. Mark records as `flushed: true` only after receiving a successful 200 OK from S3. On startup, the system should check for unflushed records and trigger a retry.

**Test scenarios:**

- Happy path: Write a list of 10 sales orders to a Parquet file and verify it exists on S3.
- Error path: Simulate an S3 connection failure during flush and verify that records remain in the local WAL and are successfully flushed on the next attempt.
- Edge case: Handle empty data sets without creating empty files.

**Verification:**

- Successful upload of a Parquet file that is readable by an external DuckDB CLI.
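One way to sketch `flush-to-parquet` is to let DuckDB itself produce and upload the file via `COPY ... TO ... (FORMAT PARQUET)`. This assumes the `with-duckdb` wrapper described in U1, a `clojure.java.jdbc` alias `jdbc`, and a hypothetical `staging` table holding the buffered rows; the bucket path is illustrative:

```clojure
;; Sketch: write one day's buffered rows as a single Parquet file on S3.
;; `staging`, its `business_date` column, and the path are assumptions.
(defn flush-to-parquet
  [s3-config date]
  (with-duckdb s3-config
    (fn [conn]
      (jdbc/execute! conn
        [(str "COPY (SELECT * FROM staging WHERE business_date = '" date "') "
              "TO 's3://bucket/sales-details/" date ".parquet' "
              "(FORMAT PARQUET)")]))))
```

Marking WAL records `flushed: true` would happen only after this call returns without throwing.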

---

- U3. **Redirect Production Flow**

**Goal:** Change the Square/EzCater integrations to write to the Parquet writer instead of Datomic.

**Requirements:** R1

**Dependencies:** U2

**Files:**

- Modify: `src/clj/auto_ap/square/core3.clj`
- Modify: `src/clj/auto_ap/ezcater/core.clj`
- Modify: `src/clj/auto_ap/routes/ezcater_xls.clj`

**Approach:**

- Replace `dc/transact` calls for detailed sales/charges with calls to the new `parquet/write` service.
- *Note:* Keep the transaction for any related entities that must stay in Datomic (e.g., Client updates).

**Verification:**

- Run a Square import and verify that no new detailed entities appear in Datomic, but a new Parquet file is created.
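The split write in an integration core can be sketched as follows. `order->rows` is a hypothetical conversion helper, and the exact `parquet/write` signature is an assumption; only the shape of the split (details to the Parquet buffer, related entities still transacted) is taken from the plan:

```clojure
;; Sketch of the split write (helper names are illustrative):
;; detailed rows go to the Parquet buffer, Datomic keeps related entities.
(defn ingest-order! [order client-tx]
  (parquet/write (order->rows order))  ; hypothetical row conversion
  @(dc/transact conn client-tx))       ; e.g. Client updates stay in Datomic
```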

---

- U4. **DuckDB Read Layer for Views**

**Goal:** Update the "Sales Orders" and "Payments" views to fetch data from DuckDB.

**Requirements:** R3, R5

**Dependencies:** U1

**Files:**

- Modify: `src/clj/auto_ap/datomic/sales_orders.clj`
- Modify: `src/clj/auto_ap/ssr/payments.clj`
- Test: `test/clj/auto_ap/integration/graphql/checks.clj`

**Approach:**

- Replace Datomic `q` and `pull` calls with DuckDB SQL queries.
- **Performance:** To optimize pagination, implement a "Metadata Index" file on S3 (or a Datomic entity) that stores the total record count per day. Use this to calculate pagination totals without scanning all Parquet files.
- **Deterministic Voids:** Use a combination of `timestamp` and a monotonic `sequence_number` in the `QUALIFY` clause to ensure deterministic results for records updated in the same millisecond.
- Map DuckDB result sets back to the existing map formats used by the views to minimize frontend changes.

**Test scenarios:**

- Happy path: List payments for a client across a date range.
- Integration: Void a payment in S3 and verify the view shows it as voided.
- Performance: Verify pagination totals load in < 200 ms using the metadata index.
- Edge case: Handle two updates to the same record in the same millisecond and verify that the latest sequence number wins.

**Verification:**

- The Payments table in the UI loads correctly and reflects the data in S3.
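The metadata-index idea for pagination totals can be sketched as a tiny aggregate over a per-day index file instead of a scan of the detail files. The index path and its `(day, record_count)` layout are assumptions for illustration:

```clojure
;; Sketch: pagination totals from a per-day metadata index.
;; The index file layout (day, record_count) is an assumption.
(def page-total-sql
  "SELECT sum(record_count) AS total
   FROM read_parquet('s3://bucket/sales-details-index/*.parquet')
   WHERE day BETWEEN ? AND ?")
```

Only the requested date range is touched, so the count stays cheap even as years of detail files accumulate.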

---

- U5. **Update Summary Aggregation Job**

**Goal:** Update the `sales-summaries` job to calculate totals using DuckDB.

**Requirements:** R2, R4

**Dependencies:** U1

**Files:**

- Modify: `src/clj/auto_ap/jobs/sales_summaries.clj`

**Approach:**

- In `get-payment-items`, `get-discounts`, `get-tax`, etc., replace the `dc/q` calls with DuckDB SQL `SUM` and `GROUP BY` queries against the daily Parquet files.
- Ensure the results are still written to the `sales-summary` entities in Datomic.

**Verification:**

- Run the `sales-summaries-v2` job and verify that the resulting Datomic summaries match the values in the S3 Parquet files.
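A hedged sketch of the replacement aggregation: one `SUM`/`GROUP BY` over that day's Parquet file. The `client_code`, `total`, and `tax` column names and the bucket path are assumptions, not the actual schema:

```clojure
;; Sketch: daily totals straight from that day's Parquet file.
;; Column names and path are illustrative assumptions.
(defn daily-summary-sql [date]
  (str "SELECT client_code, sum(total) AS total, sum(tax) AS tax "
       "FROM read_parquet('s3://bucket/sales-details/" date ".parquet') "
       "GROUP BY client_code"))
```

The job would run this per business date and transact the resulting rows into the `sales-summary` entities as before.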

---

- U6. **Historical Data Migration**

**Goal:** Move all existing detailed sales data from Datomic to Parquet files.

**Requirements:** R1

**Dependencies:** U2

**Files:**

- Create: `src/clj/auto_ap/migration/sales_to_parquet.clj`

**Approach:**

- Write a script that iterates through all historical sales orders and payments in Datomic.
- Group them by **Business Date** (the date of the sale, not the transaction date) to ensure consistency with future DuckDB queries.
- Write each day's data to the corresponding `YYYY-MM-DD.parquet` file on S3.
- Log any records with missing dates to a "dead-letter" file for manual review.

**Verification:**

- Count of records in Datomic vs. count of records in S3.
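The migration loop above can be sketched as a group-by on business date with a dead-letter branch. `all-sales-orders` and `dead-letter!` are hypothetical helpers, and the `flush-to-parquet` arity is an assumption based on the U2 description:

```clojure
;; Sketch of the migration: one Parquet file per business date,
;; dead-letter anything without a date. Helper names are illustrative.
(defn migrate! []
  (let [{dated true undated false}
        (group-by #(some? (:sales-order/date %)) (all-sales-orders))]
    (doseq [[day rows] (group-by :sales-order/date dated)]
      (flush-to-parquet rows day))
    (doseq [r undated]
      (dead-letter! r))))
```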

---

- U7. **Datomic Cleanup**

**Goal:** Remove the detailed data from Datomic to reclaim space.

**Requirements:** R1

**Dependencies:** U6

**Files:**

- Create: `src/clj/auto_ap/migration/cleanup_sales.clj`

**Approach:**

- Use `[:db/retractEntity ...]` to remove all `#:sales-order`, `#:charge`, and `#:sales-refund` entities.
- **Batching:** Perform retractions in batches (e.g., by month) with a cooldown period between batches to avoid excessive Datomic transaction log bloat and performance degradation.
- *Safety:* Only run this after verifying U6 and U4.

**Verification:**

- Datomic database size decreases; detailed queries in Datomic return empty results, while DuckDB queries return data.
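The batched retraction with a cooldown can be sketched as below; the batch size and pause are illustrative, and `entity-ids-for-month` is a hypothetical helper over the month being cleaned:

```clojure
;; Sketch: retract in bounded transactions with a pause between batches
;; to limit transaction-log growth. Numbers are illustrative.
(defn retract-month! [month]
  (doseq [batch (partition-all 1000 (entity-ids-for-month month))]
    @(dc/transact conn (mapv (fn [eid] [:db/retractEntity eid]) batch))
    (Thread/sleep 5000)))
```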

---

## System-Wide Impact

- **Interaction graph:** The integration cores now depend on the Parquet/S3 service. The SSR views and background jobs now depend on the DuckDB service.
- **Error propagation:** S3 downtime will now cause both the "Sales Orders" views and the summary job to fail. We should implement basic retry logic in the DuckDB wrapper.
- **State lifecycle risks:** There is a window between the "production" of a sale and the "flush" to Parquet. If the app crashes before a flush, data could be lost. *Mitigation:* Use a small local SQLite file as a write-ahead log for the daily buffer.

---

## Risks & Dependencies

| Risk | Mitigation |
|------|------------|
| S3 latency for views | Use DuckDB's caching and only query the files for the requested date range. |
| Data loss before flush | Implement a local SQLite staging file for the current day's data. |
| Schema drift | Use a strict schema for Parquet files; handle missing columns in SQL with `COALESCE`. |

---

## Sources & References

- Related code: `src/clj/auto_ap/jobs/sales_summaries.clj`
- Related code: `src/clj/auto_ap/ssr/payments.clj`
- External docs: [DuckDB S3 Integration](https://duckdb.org/docs/extensions/httpfs)
2 parquet-wal/test-type.jsonl (Normal file)
@@ -0,0 +1,2 @@
{"seq-no":1777103077792,"record":{"id":2}}{"seq-no":1777103077984,"record":{"id":1,"name":"test"}}{"seq-no":1777103126496,"record":{"id":2}}
{"seq-no":1777103126692,"record":{"id":1,"name":"test"}}
22 project.clj
@@ -93,18 +93,14 @@
 [hiccup "2.0.0-alpha2"]

 ;; needed for java 11
 [javax.xml.bind/jaxb-api "2.4.0-b180830.0359"]
 [io.forward/clojure-mail "1.0.8"]
-[lambdaisland/edn-lines "1.0.10"]]
-:managed-dependencies [;; explicit dependencies to get to latest versions for above
-[com.fasterxml.jackson.core/jackson-core "2.12.0"]
-[com.fasterxml.jackson.core/jackson-databind "2.12.0"]
-[com.fasterxml.jackson.core/jackson-annotations "2.12.0"]
-[com.fasterxml.jackson.dataformat/jackson-dataformat-cbor "2.12.0"]
-[commons-codec "1.12"]]
-:plugins [[lein-ring "0.9.7"]
+[lambdaisland/edn-lines "1.0.10"]
+[org.duckdb/duckdb_jdbc "1.1.0"]
+[org.xerial/sqlite-jdbc "3.45.1.0"]
+[com.fasterxml.jackson.core/jackson-core "2.12.0"]
+[com.fasterxml.jackson.core/jackson-databind "2.12.0"]
 [lein-cljsbuild "1.1.5"]
 [lein-ancient "0.6.15"]]
 :clean-targets ^{:protect false} ["resources/public/js/compiled" "target"]
@@ -144,7 +140,7 @@
 [com.bhauman/rebel-readline-cljs "0.1.4" :exclusions [org.clojure/clojurescript]]
 [javax.servlet/servlet-api "2.5"]]
 :plugins [[lein-pdo "0.1.1"]]
-:jvm-opts ["-Dconfig=config/dev.edn" "-Xms4G" "-Xmx20G" "-XX:-OmitStackTraceInFastThrow"]}
+:jvm-opts ["-Dconfig=config/dev.edn" "-Xms4G" "-Xmx20G" "-XX:-OmitStackTraceInFastThrow" "-Djava.library.path=/home/noti/.local/lib"]}

 :uberjar
 {:java-cmd "/usr/lib/jvm/java-11-openjdk/bin/java"
@@ -1,171 +1,178 @@
 (ns auto-ap.datomic.sales-orders
   (:require
-   [auto-ap.datomic
-    :refer [add-sorter-fields-2
-            apply-pagination
-            apply-sort-3
-            conn
-            merge-query
-            pull-id
-            pull-many
-            query2
-            visible-clients]]
-   [clj-time.coerce :as c]
-   [clj-time.core :as time]
+   [auto-ap.storage.parquet :as pq]
+   [auto-ap.time :as atime]
+   [clj-time.coerce :as coerce]
    [clojure.set :as set]
+   [clojure.string :as str]
    [com.brunobonacci.mulog :as mu]
-   [datomic.api :as dc]
-   [iol-ion.query]))
+   [ring.util.codec :as ring-codec]))

-(defn <-datomic [result]
-  (-> result
-      (update :sales-order/date c/from-date)
-      (update :sales-order/charges (fn [cs]
-                                     (map (fn [c]
-                                            (-> c
-                                                (update :charge/processor :db/ident)
-                                                (set/rename-keys {:expected-deposit/_charges :expected-deposit})
-                                                (update :expected-deposit first)))
-                                          cs)))))
+(defn- payment-methods->charges [pm-str]
+  (when (not-empty pm-str)
+    (mapv (fn [pm] {:charge/type-name pm})
+          (str/split pm-str #","))))

-(def default-read '[:db/id
-                    :sales-order/external-id,
-                    :sales-order/location,
-                    :sales-order/date,
-                    :sales-order/total,
-                    :sales-order/tax,
-                    :sales-order/tip,
-                    :sales-order/line-items,
-                    :sales-order/discount,
-                    :sales-order/returns,
-                    :sales-order/service-charge,
-                    :sales-order/vendor,
-                    :sales-order/source,
-                    :sales-order/reference-link,
-                    {:sales-order/client [:client/name :db/id :client/code]
-                     :sales-order/charges [:charge/type-name,
-                                           :charge/total,
-                                           :charge/tax,
-                                           :charge/tip,
-                                           :charge/external-id,
-                                           :charge/note,
-                                           :charge/date,
-                                           :charge/client,
-                                           :charge/location,
-                                           :charge/reference-link,
-                                           {:charge/processor [:db/ident]} {:expected-deposit/_charges [:db/id]}]}])
+(defn <-row
+  "Convert a flat parquet row into the shape consumers expect."
+  [row]
+  (let [pm (:payment-methods row)]
+    (-> row
+        (set/rename-keys
+         {:external-id :sales-order/external-id
+          :location :sales-order/location
+          :total :sales-order/total
+          :tax :sales-order/tax
+          :tip :sales-order/tip
+          :discount :sales-order/discount
+          :service-charge :sales-order/service-charge
+          :vendor :sales-order/vendor
+          :client-code :sales-order/client-code
+          :date :sales-order/date
+          :source :sales-order/source
+          :reference-link :sales-order/reference-link
+          :payment-methods :sales-order/payment-methods
+          :processors :sales-order/processors
+          :categories :sales-order/categories})
+        (update :sales-order/date #(some-> % str))
+        (dissoc :entity-type :_seq-no)
+        (assoc :sales-order/charges (payment-methods->charges pm)))))

-(defn raw-graphql-ids [db args]
-  (let [visible-clients (set (map :db/id (:clients args)))
-        selected-clients (->> (cond
-                                (:client-id args)
-                                (set/intersection #{(:client-id args)}
-                                                  visible-clients)
+(defn build-where-clause [args]
+  (let [clauses (keep identity
+                      [(when-let [c (:client-code args)]
+                         (str "external_id.client = '" c "'"))
+                       (when-let [v (:vendor args)]
+                         (str "external_id.vendor = '" (name v) "'"))
+                       (when-let [l (:location args)]
+                         (str "location = '" l "'"))])]
+    (when (seq clauses)
+      (str "WHERE " (str/join " AND " clauses)))))

+(defn build-sort-clause [args]
+  (let [sort (or (:sort args) "date")
+        order (or (:order args) "DESC")]
+    (str "ORDER BY " sort " " order)))

-                                (:client-code args)
-                                (set/intersection #{(pull-id db [:client/code (:client-code args)])}
-                                                  visible-clients)
+(def page-size 100)

-                                :else
-                                visible-clients)
-                              (take 10)
-                              set)
-        _ (mu/log ::selected-clients
-                  :selected-clients selected-clients)
-        query (cond-> {:query {:find []
-                               :in ['$ '[?clients ?start-date ?end-date]]
-                               :where '[[(iol-ion.query/scan-sales-orders $ ?clients ?start-date ?end-date) [[?e _ ?sort-default] ...]]]}
-                       :args [db [selected-clients
-                                  (some-> (:start (:date-range args)) c/to-date)
-                                  (some-> (:end (:date-range args)) c/to-date)]]}
+(defn raw-graphql-ids [args]
+  (let [start (some-> (:start (:date-range args)) .toString)
+        end (some-> (:end (:date-range args)) (.substring 0 10))
+        limit (or (:limit args) page-size)
+        offset (or (:offset args) 0)]
+    (when start
+      (let [result (pq/get-sales-orders start end
+                                        {:client (:client-code args)
+                                         :vendor (:vendor args)
+                                         :location (:location args)
+                                         :sort (or (:sort args) "date")
+                                         :order "DESC"
+                                         :limit limit
+                                         :offset offset})]
+        {:ids (mapv #(str (:external-id %)) (:rows result))
+         :rows (:rows result)
+         :count (:count result)}))))

-                (:sort args) (add-sorter-fields-2 {"client" ['[?e :sales-order/client ?c]
-                                                             '[?c :client/name ?sort-client]]
-                                                   "location" ['[?e :sales-order/location ?sort-location]]
-                                                   "source" ['[?e :sales-order/source ?sort-source]]
-                                                   "date" ['[?e :sales-order/date ?sort-date]]
-                                                   "total" ['[?e :sales-order/total ?sort-total]]
-                                                   "tax" ['[?e :sales-order/tax ?sort-tax]]
-                                                   "tip" ['[?e :sales-order/tip ?sort-tip]]}
-                                                  args)
-                (:category args)
-                (merge-query {:query {:in ['?category]
-                                      :where ['[?e :sales-order/line-items ?li]
-                                              '[?li :order-line-item/category ?category]]}
-                              :args [(:category args)]})
+(defn graphql-results [rows _ids _args]
+  (mapv <-row rows))

-                (:processor args)
-                (merge-query {:query {:in ['?processor]
-                                      :where ['[?e :sales-order/charges ?chg]
-                                              '[?chg :charge/processor ?processor]]}
-                              :args [(keyword "ccp-processor"
-                                              (name (:processor args)))]})
-                (:type-name args)
-                (merge-query {:query {:in ['?type-name]
-                                      :where ['[?e :sales-order/charges ?chg]
-                                              '[?chg :charge/type-name ?type-name]]}
-                              :args [(:type-name args)]})
+(defn- extract-date-str [v]
+  (when v
+    (cond
+      (string? v) (if (> (count v) 10) (.substring v 0 10) v)
+      (instance? org.joda.time.DateTime v) (atime/unparse-local v atime/normal-date)
+      (instance? org.joda.time.LocalDate v) (atime/unparse-local v atime/normal-date)
+      (instance? java.util.Date v) (atime/unparse-local (coerce/to-date-time v) atime/normal-date)
+      (instance? java.time.LocalDate v) (.toString v)
+      :else (str v))))

-                (:total-gte args)
-                (merge-query {:query {:in ['?total-gte]
-                                      :where ['[?e :sales-order/total ?a]
-                                              '[(>= ?a ?total-gte)]]}
-                              :args [(:total-gte args)]})
+(defn- get-date [qp k]
+  (or (extract-date-str (get qp k))
+      (extract-date-str (get qp (name k)))))

-                (:total-lte args)
-                (merge-query {:query {:in ['?total-lte]
-                                      :where ['[?e :sales-order/total ?a]
-                                              '[(<= ?a ?total-lte)]]}
-                              :args [(:total-lte args)]})
+(defn- kw->str [v]
+  (when (some? v)
+    (if (keyword? v) (name v) (str v))))

-                (:total args)
-                (merge-query {:query {:in ['?total]
-                                      :where ['[?e :sales-order/total ?sales-order-total]
-                                              '[(iol-ion.query/dollars= ?sales-order-total ?total)]]}
-                              :args [(:total args)]})
+(defn- qp->opts [qp]
+  (let [sort-params (:sort qp)
+        sort-key (when (seq sort-params) (-> sort-params first :name))
+        sort-dir (when (seq sort-params) (-> sort-params first :dir))]
+    (cond-> {}
+      (some? (:client-code qp)) (assoc :client (kw->str (:client-code qp)))
+      (some? (:location qp)) (assoc :location (kw->str (:location qp)))
+      (not-empty (:payment-method qp)) (assoc :payment-method (:payment-method qp))
+      (some? (:processor qp)) (assoc :processor (kw->str (:processor qp)))
+      (not-empty (:category qp)) (assoc :category (:category qp))
+      (:total-gte qp) (assoc :total-gte (:total-gte qp))
+      (:total-lte qp) (assoc :total-lte (:total-lte qp))
+      sort-key (assoc :sort sort-key)
+      sort-dir (assoc :order (or sort-dir "DESC"))
+      true (assoc :limit (or (:per-page qp) 25)
+                  :offset (or (:start qp) 0)))))

-                true
-                (merge-query {:query {:find ['?date '?e]
-                                      :where ['[?e :sales-order/date ?date]]}}))]
+(defn- last-week-range []
+  (let [today (java.time.LocalDate/now)
+        end (.toString (.minusDays today 1))
+        start (.toString (.minusDays today 8))]
+    [start end]))

-    (cond->> (query2 query)
-      true (apply-sort-3 (assoc args :default-asc? false))
-      true (apply-pagination args))))
+(defn- default-date-range []
+  (let [[s e] (last-week-range)
+        result (try (pq/get-sales-orders-summary s e) (catch Exception _ nil))]
+    (if (and result (> (:total result) 0))
+      [s e]
+      (let [yesterday (.toString (.minusDays (java.time.LocalDate/of 2024 4 24) 1))
+            week-before (.toString (.minusDays (java.time.LocalDate/of 2024 4 24) 8))]
+        [week-before yesterday]))))

-(defn graphql-results [ids db _]
-  (let [results (->> (pull-many db default-read ids)
-                     (group-by :db/id))
-        payments (->> ids
-                      (map results)
-                      (map first)
-                      (mapv <-datomic))]
-    payments))
+(defn- qp->date-range [qp]
+  (let [[default-start default-end] (default-date-range)]
+    [(or (get-date qp :start-date)
+         (extract-date-str (get-in qp [:date-range :start]))
+         default-start)
+     (or (get-date qp :end-date)
+         (extract-date-str (get-in qp [:date-range :end]))
+         default-end)]))

-(defn summarize-orders [ids]
-  (let [[total tax] (->>
-                     (dc/q {:find ['(sum ?t) '(sum ?tax)]
-                            :with ['?id]
-                            :in ['$ '[?id ...]]
-                            :where ['[?id :sales-order/total ?t]
-                                    '[?id :sales-order/tax ?tax]]}
-                           (dc/db conn)
-                           ids)
-                     first)]
-    {:total total
-     :tax tax}))
+(defn fetch-page-ssr
+  "Fetch sales orders from parquet for the SSR page."
+  [request]
+  (let [qp (:query-params request)
+        raw-qp (some-> (:query-string request)
+                       ring-codec/form-decode
+                       (->> (into {} (remove (fn [[_ v]] (str/blank? v))))))
+        [start end] (qp->date-range (merge raw-qp qp))
+        opts (qp->opts qp)
+        result (pq/get-sales-orders start end opts)
+        rows (mapv <-row (:rows result))]
+    {:rows rows :count (:count result)}))
+
+(defn summarize-page-ssr
+  "Summarize all matching sales orders via parquet."
+  [request]
+  (let [qp (:query-params request)
+        raw-qp (some-> (:query-string request)
+                       ring-codec/form-decode
+                       (->> (into {} (remove (fn [[_ v]] (str/blank? v))))))
+        [start end] (qp->date-range (merge raw-qp qp))
+        opts (dissoc (qp->opts qp) :limit :offset :sort :order)]
+    (pq/get-sales-orders-summary start end opts)))
+
+(defn summarize-orders [rows]
+  (when (seq rows)
+    (let [total (reduce + 0.0 (map #(or (:sales-order/total %) 0.0) rows))
+          tax (reduce + 0.0 (map #(or (:sales-order/tax %) 0.0) rows))]
+      {:total total
+       :tax tax})))

 (defn get-graphql [args]
-  (let [db (dc/db conn)
-        {ids-to-retrieve :ids matching-count :count} (mu/trace ::get-sales-order-ids [] (raw-graphql-ids db args))]
-    [(->> (mu/trace ::get-results [] (graphql-results ids-to-retrieve db args)))
-     matching-count
-     (summarize-orders ids-to-retrieve)]))
+  (let [{:keys [ids rows count]} (mu/trace ::get-sales-order-ids [] (raw-graphql-ids args))]
+    [(mu/trace ::get-results [] (graphql-results rows ids args))
+     count
+     (summarize-orders rows)]))

 (defn summarize-graphql [args]
-  (let [db (dc/db conn)
-        {ids-to-retrieve :ids matching-count :count} (mu/trace ::get-sales-order-ids [] (raw-graphql-ids db args))]
-    (summarize-orders ids-to-retrieve)))
+  (let [{:keys [rows]} (raw-graphql-ids args)]
+    (summarize-orders rows)))
@@ -1,6 +1,7 @@
|
|||||||
(ns auto-ap.ezcater.core
|
(ns auto-ap.ezcater.core
|
||||||
(:require
|
(:require
|
||||||
[auto-ap.datomic :refer [conn random-tempid]]
|
[auto-ap.datomic :refer [conn random-tempid]]
|
||||||
|
[auto-ap.storage.parquet :as parquet]
|
||||||
[datomic.api :as dc]
|
[datomic.api :as dc]
|
||||||
[clj-http.client :as client]
|
[clj-http.client :as client]
|
||||||
[venia.core :as v]
|
[venia.core :as v]
|
||||||
@@ -20,35 +21,34 @@
|
|||||||
:body (json/write-str {"query" (v/graphql-query q)})
|
:body (json/write-str {"query" (v/graphql-query q)})
|
||||||
:as :json})
|
:as :json})
|
||||||
:body
|
:body
|
||||||
:data
|
:data))
|
||||||
))
|
|
||||||
|
|
||||||
(defn get-caterers [integration]
|
(defn get-caterers [integration]
|
||||||
(:caterers (query integration {:venia/queries [{:query/data
|
(:caterers (query integration {:venia/queries [{:query/data
|
||||||
[:caterers [:name :uuid [:address [:name :street]]]]}]} )))
|
[:caterers [:name :uuid [:address [:name :street]]]]}]})))
|
||||||
|
|
||||||
(defn get-subscriptions [integration]
|
(defn get-subscriptions [integration]
|
||||||
(->> (query integration {:venia/queries [{:query/data
|
(->> (query integration {:venia/queries [{:query/data
|
||||||
[:subscribers [:id [:subscriptions [:parentId :parentEntity :eventEntity :eventKey]] ]]}]} )
|
[:subscribers [:id [:subscriptions [:parentId :parentEntity :eventEntity :eventKey]]]]}]})
|
||||||
:subscribers
|
:subscribers
|
||||||
first
|
first
|
||||||
:subscriptions))
|
:subscriptions))
|
||||||
|
|
||||||
(defn get-integrations []
|
(defn get-integrations []
|
||||||
(map first (dc/q '[:find (pull ?i [:ezcater-integration/api-key
|
(map first (dc/q '[:find (pull ?i [:ezcater-integration/api-key
|
||||||
:ezcater-integration/subscriber-uuid
|
:ezcater-integration/subscriber-uuid
|
||||||
:db/id
|
:db/id
|
||||||
:ezcater-integration/integration-status [:db/id]])
|
{:ezcater-integration/integration-status [:db/id]}])
|
||||||
:in $
|
:in $
|
||||||
:where [?i :ezcater-integration/api-key]]
|
:where [?i :ezcater-integration/api-key]]
|
||||||
(dc/db conn))))
|
(dc/db conn))))
|
||||||
|
|
||||||
(defn mark-integration-status [integration integration-status]
|
(defn mark-integration-status [integration integration-status]
|
||||||
@(dc/transact conn
|
@(dc/transact conn
|
||||||
[{:db/id (:db/id integration)
|
[{:db/id (:db/id integration)
|
||||||
:ezcater-integration/integration-status (assoc integration-status
|
:ezcater-integration/integration-status (assoc integration-status
|
||||||
:db/id (or (-> integration :ezcater-integration/integration-status :db/id)
|
:db/id (or (-> integration :ezcater-integration/integration-status :db/id)
|
||||||
(random-tempid)))}]))
|
(random-tempid)))}]))
|
||||||
|
|
||||||
(defn upsert-caterers
|
(defn upsert-caterers
|
||||||
([integration]
|
([integration]
|
||||||
@@ -64,14 +64,14 @@
|
|||||||
([integration]
|
([integration]
|
||||||
(let [extant (get-subscriptions integration)
|
(let [extant (get-subscriptions integration)
|
||||||
to-ensure (set (map first (dc/q '[:find ?cu
|
to-ensure (set (map first (dc/q '[:find ?cu
|
||||||
:in $
|
:in $
|
||||||
:where [_ :client/ezcater-locations ?el]
|
:where [_ :client/ezcater-locations ?el]
|
||||||
[?el :ezcater-location/caterer ?c]
|
[?el :ezcater-location/caterer ?c]
|
||||||
[?c :ezcater-caterer/uuid ?cu]]
|
[?c :ezcater-caterer/uuid ?cu]]
|
||||||
(dc/db conn))))
|
(dc/db conn))))
|
||||||
to-create (set/difference
|
to-create (set/difference
|
||||||
to-ensure
|
to-ensure
|
||||||
(set (map :parentId extant)))]
|
(set (map :parentId extant)))]
|
||||||
(doseq [parentId to-create]
|
(doseq [parentId to-create]
|
||||||
(query integration
|
(query integration
|
||||||
{:venia/operation {:operation/type :mutation
|
{:venia/operation {:operation/type :mutation
|
||||||
@@ -94,7 +94,6 @@
                           :eventKey 'cancelled}}
               [[:subscription [:parentId :parentEntity :eventEntity :eventKey]]]]]})))))


#_{:clj-kondo/ignore [:clojure-lsp/unused-public-var]}
(defn upsert-ezcater
  ([] (upsert-ezcater (get-integrations)))
@@ -115,12 +114,11 @@

(defn get-caterer [caterer-uuid]
  (dc/pull (dc/db conn)
           '[:ezcater-caterer/name
             {:ezcater-integration/_caterers [:ezcater-integration/api-key]}
             {:ezcater-location/_caterer [:ezcater-location/location
                                          {:client/_ezcater-locations [:client/code]}]}]
           [:ezcater-caterer/uuid caterer-uuid]))


(defn round-carry-cents [f]
  (with-precision 2 (double (.setScale (bigdec f) 2 java.math.RoundingMode/HALF_UP))))
@@ -136,125 +134,158 @@
                     :else
                     0.07M)]
    (round-carry-cents
     (* commision%
        0.01M
        (+
         (-> order :totals :subTotal :subunits)
         (reduce +
                 0
                 (map (comp :subunits :cost) (:feesAndDiscounts (:catererCart order)))))))))

(defn ccp-fee [order]
  (round-carry-cents
   (* 0.000299M
      (+
       (-> order :totals :subTotal :subunits)
       (-> order :totals :salesTax :subunits)
       (reduce +
               0
               (map (comp :subunits :cost) (:feesAndDiscounts (:catererCart order))))))))

(defn order->sales-order [{{:keys [timestamp]} :event {:keys [orderItems]} :catererCart :keys [client-code client-location uuid] :as order}]
  (let [adjustment (round-carry-cents (- (+ (-> order :totals :subTotal :subunits (* 0.01))
                                            (-> order :totals :salesTax :subunits (* 0.01)))
                                         (-> order :catererCart :totals :catererTotalDue)
                                         (commision order)
                                         (ccp-fee order)))
        service-charge (+ (commision order) (ccp-fee order))
        tax (-> order :totals :salesTax :subunits (* 0.01))
        tip (-> order :totals :tip :subunits (* 0.01))]
    #:sales-order
    {:date (atime/localize (coerce/to-date-time timestamp))
     :external-id (str "ezcater/order/" client-code "-" client-location "-" uuid)
     :client [:client/code client-code]
     :location client-location
     :reference-link (str (url/url "https://ezmanage.ezcater.com/orders/" uuid))
     :line-items [#:order-line-item
                  {:external-id (str "ezcater/order/" client-code "-" client-location "-" uuid "-" 0)
                   :item-name "EZCater Catering"
                   :category "EZCater Catering"
                   :discount adjustment
                   :tax tax
                   :total (+ (-> order :totals :subTotal :subunits (* 0.01))
                             tax
                             tip)}]
     :charges [#:charge
               {:type-name "CARD"
                :date (atime/localize (coerce/to-date-time timestamp))
                :client [:client/code client-code]
                :location client-location
                :external-id (str "ezcater/charge/" uuid)
                :processor :ccp-processor/ezcater
                :total (+ (-> order :totals :subTotal :subunits (* 0.01))
                          tax
                          tip)
                :tip tip}]

     :total (+ (-> order :totals :subTotal :subunits (* 0.01))
               tax
               tip)
     :discount adjustment
     :service-charge service-charge
     :tax tax
     :tip tip
     :returns 0.0
     :vendor :vendor/ccp-ezcater}))

(defn- flatten-order-to-parquet!
  "Flatten a sales-order into entity-type tagged maps and buffer to parquet."
  [order]
  (let [so-ext-id (:sales-order/external-id order)
        so-date (some-> (:sales-order/date order) .toString)
        client (:sales-order/client order)
        client-code (if (map? client) (:client/code client) client)]
    (parquet/buffer! "sales-order"
                     {:entity-type "sales-order"
                      :external-id so-ext-id
                      :client-code client-code
                      :location (:sales-order/location order)
                      :vendor (:sales-order/vendor order)
                      :total (:sales-order/total order)
                      :tax (:sales-order/tax order)
                      :tip (:sales-order/tip order)
                      :discount (:sales-order/discount order)
                      :service-charge (:sales-order/service-charge order)
                      :date so-date})
    (when-let [charges (:sales-order/charges order)]
      (doseq [chg charges]
        (parquet/buffer! "charge"
                         {:entity-type "charge"
                          :external-id (:charge/external-id chg)
                          :type-name (:charge/type-name chg)
                          :total (:charge/total chg)
                          :tax (:charge/tax chg)
                          :tip (:charge/tip chg)
                          :date so-date
                          :processor (some-> (:charge/processor chg) name)
                          :sales-order-external-id so-ext-id})))
    (when-let [items (:sales-order/line-items order)]
      (doseq [li items]
        (parquet/buffer! "line-item"
                         {:entity-type "line-item"
                          :item-name (:order-line-item/item-name li)
                          :category (:order-line-item/category li)
                          :total (:order-line-item/total li)
                          :tax (:order-line-item/tax li)
                          :discount (:order-line-item/discount li)
                          :sales-order-external-id so-ext-id})))))

(defn get-by-id [integration id]
  (query
   integration
   {:venia/queries [[:order {:id id}
                     [:uuid
                      :orderNumber
                      :orderSourceType
                      [:caterer
                       [:name
                        :uuid
                        [:address [:street]]]]
                      [:event
                       [:timestamp
                        :catererHandoffFoodTime
                        :orderType]]
                      [:catererCart [[:orderItems
                                      [:name
                                       :quantity
                                       :posItemId
                                       [:totalInSubunits
                                        [:currency
                                         :subunits]]]]
                                     [:totals
                                      [:catererTotalDue]]
                                     [:feesAndDiscounts
                                      {:type 'DELIVERY_FEE}
                                      [[:cost
                                        [:currency
                                         :subunits]]]]]]
                      [:totals [[:customerTotalDue
                                 [:currency
                                  :subunits]]
                                [:pointOfSaleIntegrationFee
                                 [:currency
                                  :subunits]]
                                [:tip
                                 [:currency
                                  :subunits]]
                                [:salesTax
                                 [:currency
                                  :subunits]]
                                [:salesTaxRemittance
                                 [:currency
                                  :subunits]]
                                [:subTotal
                                 [:currency
                                  :subunits]]]]]]]}))

(defn lookup-order [json]
  (let [caterer (get-caterer (get json "parent_id"))
@@ -263,25 +294,30 @@
        location (-> caterer :ezcater-location/_caterer first :ezcater-location/location)]
    (if (and client location)
      (doto
       (-> (get-by-id integration (get json "entity_id"))
           (:order)
           (assoc :client-code client
                  :client-location location))
       (#(alog/info ::order-details :detail %)))
      (alog/warn ::caterer-no-longer-has-location :json json))))

(defn import-order [json]
  (alog/info
   ::try-import-order
   :json json)
  (when-let [order (some-> json
                           (lookup-order)
                           (order->sales-order)
                           (update :sales-order/date coerce/to-date)
                           (update-in [:sales-order/charges 0 :charge/date] coerce/to-date))]
    (try
      (flatten-order-to-parquet! order)
      (alog/info ::order-buffered
                 :external-id (:sales-order/external-id order))
      (catch Exception e
        (alog/error ::buffer-failed
                    :exception e
                    :order (:sales-order/external-id order))))))

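;; REPL sketch (hypothetical, not part of this change): exercising the new
;; parquet buffering path end-to-end with a synthetic webhook payload. The
;; string keys mirror the ones lookup-order reads ("parent_id", "entity_id");
;; the UUID values are illustrative only. On success this should log
;; ::order-buffered rather than transacting details into Datomic.
(comment
  (import-order {"parent_id" "91541331-d7ae-4634-9e8b-ccbbcfb2ce70"
                 "entity_id" "9ab05fee-a9c5-483b-a7f2-14debde4b7a8"
                 "key" "accepted"
                 "occurred_at" "2022-07-21T19:21:07.549Z"}))
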
(defn upsert-recent []
  (upsert-ezcater)
  (let [last-sunday (coerce/to-date (time/plus (second (->> (time/today)
@@ -312,30 +348,30 @@
                      "key" "accepted",
                      "occurred_at" "2022-07-21T19:21:07.549Z"}
          ezcater-order (lookup-order lookup-map)
          extant-order (dc/pull (dc/db conn)
                                '[:sales-order/total
                                  :sales-order/tax
                                  :sales-order/tip
                                  :sales-order/discount
                                  :sales-order/external-id
                                  {:sales-order/charges [:charge/tax
                                                         :charge/tip
                                                         :charge/total
                                                         :charge/external-id]
                                   :sales-order/line-items [:order-line-item/external-id
                                                            :order-line-item/total
                                                            :order-line-item/tax
                                                            :order-line-item/discount]}]
                                [:sales-order/external-id order])

          updated-order (-> (order->sales-order ezcater-order)
                            (select-keys
                             #{:sales-order/total
                               :sales-order/tax
                               :sales-order/tip
                               :sales-order/discount
                               :sales-order/charges
                               :sales-order/external-id
                               :sales-order/line-items})
                            (update :sales-order/line-items
                                    (fn [c]
                                      (map #(select-keys % #{:order-line-item/external-id
@@ -3,6 +3,7 @@
            [auto-ap.jobs.core :refer [execute]]
            [auto-ap.logging :as alog]
            [auto-ap.time :as atime]
            [auto-ap.storage.parquet :as pq]
            [clj-time.coerce :as c]
            [clj-time.core :as time]
            [clj-time.periodic :as per]
@@ -39,17 +40,14 @@
        (dc/db conn)
        number)))


(defn delete-all []
  @(dc/transact-async conn
    (->>
     (dc/q '[:find ?ss
             :where [?ss :sales-summary/date]]
           (dc/db conn))
     (map (fn [[ss]]
            [:db/retractEntity ss])))))



(defn dirty-sales-summaries [c]
  (let [client-id (dc/entid (dc/db conn) c)]
@@ -98,101 +96,94 @@
   "card refunds" 41400
   "food app refunds" 41400})

(defn- get-payment-items-parquet [c date]
  (let [date-str (.toString date)]
    (when-let [rows (seq (pq/query-deduped "charge" date-str date-str))]
      (let [client-code (if (map? c) (:client/code c) c)
            filtered (filter #(= client-code (:client_code %)) rows)]
        (reduce
         (fn [acc {:keys [processor type-name total]}]
           (update acc
                   (cond
                     (= type-name "CARD") "Card Payments"
                     (= type-name "CASH") "Cash Payments"
                     (#{"SQUARE_GIFT_CARD" "WALLET" "GIFT_CARD"} type-name) "Gift Card Payments"
                     (#{"doordash" "grubhub" "uber-eats"} processor) "Food App Payments"
                     :else "Unknown")
                   (fnil + 0.0)
                   (or total 0.0)))
         {}
         filtered)))))

(defn- get-discounts-parquet [c date]
  (let [client-code (if (map? c) (:client/code c) c)
        date-str (.toString date)
        discount (auto-ap.storage.sales-summaries/sum-discounts client-code date-str date-str)]
    (when (and discount (pos? discount))
      {:db/id (str (java.util.UUID/randomUUID))
       :sales-summary-item/sort-order 1
       :sales-summary-item/category "Discounts"
       :ledger-mapped/amount discount
       :ledger-mapped/ledger-side :ledger-side/debit})))

(defn- get-refund-items-parquet [c date]
  (let [client-code (if (map? c) (:client/code c) c)
        date-str (.toString date)
        refunds (auto-ap.storage.sales-summaries/sum-refunds-by-type client-code date-str date-str)]
    (when (seq refunds)
      (map (fn [[type-name total]]
             {:db/id (str (java.util.UUID/randomUUID))
              :sales-summary-item/sort-order 3
              :sales-summary-item/category (cond
                                             (= type-name "CARD") "Card Refunds"
                                             (= type-name "CASH") "Cash Refunds"
                                             :else "Food App Refunds")
              :ledger-mapped/amount total
              :ledger-mapped/ledger-side :ledger-side/credit})
           refunds))))

(defn- get-fees [c date]
  (when-let [fee (get-fee c date)]
    {:db/id (str (java.util.UUID/randomUUID))
     :sales-summary-item/sort-order 2
     :sales-summary-item/category "Fees"
     :ledger-mapped/amount fee
     :ledger-mapped/ledger-side :ledger-side/debit}))

(defn- get-tax-parquet [c date]
  (let [client-code (if (map? c) (:client/code c) c)
        date-str (.toString date)
        tax (auto-ap.storage.sales-summaries/sum-taxes client-code date-str date-str)]
    {:db/id (str (java.util.UUID/randomUUID))
     :sales-summary-item/category "Tax"
     :sales-summary-item/sort-order 1
     :ledger-mapped/ledger-side :ledger-side/credit
     :ledger-mapped/amount (or tax 0.0)}))

(defn- get-tip-parquet [c date]
  (let [client-code (if (map? c) (:client/code c) c)
        date-str (.toString date)
        tip (auto-ap.storage.sales-summaries/sum-tips client-code date-str date-str)]
    {:ledger-mapped/ledger-side :ledger-side/credit
     :sales-summary-item/sort-order 2
     :db/id (str (java.util.UUID/randomUUID))
     :sales-summary-item/category "Tip"
     :ledger-mapped/amount (or tip 0.0)}))

(defn- get-sales-parquet [c date]
  (let [client-code (if (map? c) (:client/code c) c)
        date-str (.toString date)
        sales (auto-ap.storage.sales-summaries/sum-sales-by-category client-code date-str date-str)]
    (for [{:keys [category total tax discount]} sales]
      {:db/id (str (java.util.UUID/randomUUID))
       :sales-summary-item/category (or category "Unknown")
       :sales-summary-item/sort-order 0
       :sales-summary-item/total total
       :sales-summary-item/net (- (+ total discount) tax)
       :sales-summary-item/tax tax
       :sales-summary-item/discount discount
       :ledger-mapped/ledger-side :ledger-side/credit
       :ledger-mapped/amount (- (+ total discount) tax)})))

(defn get-fees [c date]
  (when-let [fee (get-fee c date)]
@@ -293,19 +284,17 @@

        :sales-summary/items
        (->>
         (get-sales-parquet c date)
         (concat (get-payment-items-parquet c date))
         (concat (get-refund-items-parquet c date))
         (cons (get-discounts-parquet c date))
         (cons (get-fees c date))
         (cons (get-tax-parquet c date))
         (cons (get-tip-parquet c date))
         (filter identity)
         (map (fn [z]
                (assoc z :ledger-mapped/account (some-> z :sales-summary-item/category str/lower-case name->number lookup-account)
                         :sales-summary-item/manual? false))))}]

      (if (seq (:sales-summary/items result))
        (do
          (alog/info ::upserting-summaries
@@ -313,12 +302,11 @@
          @(dc/transact conn [[:upsert-entity result]]))
        @(dc/transact conn [{:db/id id :sales-summary/dirty false}]))))))

(comment
  ;; TODO: Move to test file or proper location
  (let [c (auto-ap.datomic/pull-attr (dc/db conn) :db/id [:client/code "NGCL"])
        date #inst "2024-04-14T00:00:00-07:00"]
    (get-payment-items-parquet c date)))


(defn reset-summaries []
  @(dc/transact conn (->> (dc/q '[:find ?sos
@@ -328,16 +316,13 @@
       (map (fn [[sos]]
              [:db/retractEntity sos])))))



(comment
  (auto-ap.datomic/transact-schema conn)

  @(dc/transact conn [{:db/ident :sales-summary/total-unknown-processor-payments
                       :db/noHistory true,
                       :db/valueType :db.type/double
                       :db/cardinality :db.cardinality/one}])

  (apply mark-dirty [:client/code "NGCL"] (last-n-days 30))

@@ -369,23 +354,18 @@
        (dc/db conn)
        [[(auto-ap.datomic/pull-attr (dc/db conn) :db/id [:client/code "NGHW"])] #inst "2024-04-11T00:00:00-07:00" #inst "2024-04-11T00:00:00-07:00"])

  (dc/q '[:find ?n
          :in $ [?clients ?start-date ?end-date]
          :where [(iol-ion.query/scan-sales-orders $ ?clients ?start-date ?end-date) [[?e _ ?sort-default] ...]]
          [?e :sales-order/line-items ?li]
          [?li :order-line-item/item-name ?n]]
        (dc/db conn)
        [[(auto-ap.datomic/pull-attr (dc/db conn) :db/id [:client/code "NGCL"])] #inst "2024-04-11T00:00:00-07:00" #inst "2024-04-24T00:00:00-07:00"])

  @(dc/transact conn [{:db/id :sales-summary/total-tax :db/ident :sales-summary/total-tax-legacy}
                      {:db/id :sales-summary/total-tip :db/ident :sales-summary/total-tip-legacy}])

  (auto-ap.datomic/transact-schema conn))

(defn -main [& _]
  (execute "sales-summaries" sales-summaries-v2))

src/clj/auto_ap/migration/cleanup_sales.clj (new file, 219 lines)
@@ -0,0 +1,219 @@
|
(ns auto-ap.migration.cleanup-sales
|
||||||
|
(:require [auto-ap.datomic :refer [conn]]
|
||||||
|
[auto-ap.storage.parquet :as pq]
|
||||||
|
[amazonica.aws.s3 :as s3]
|
||||||
|
[datomic.api :as d-api]
|
||||||
|
[clojure.string :as str]))
|
||||||
|
|
||||||
|
(def ^:private BATCH-SIZE 1000)
|
||||||
|
(def ^:private DRY-RUN? true)
|
||||||
|
|
||||||
|
(defn- set-dry-run! [v]
|
||||||
|
(alter-var-root #'DRY-RUN? (constantly v)))
|
||||||
|
|
||||||
|
; -- query helpers
|
||||||
|
|
||||||
|
(defn- query-sales-order-ids
|
||||||
|
"Return all entity IDs that have :sales-order/external-id."
|
||||||
|
[db]
|
||||||
|
(->> (d-api/q '[:find ?e
|
||||||
|
:where [?e :sales-order/external-id]]
|
||||||
|
db)
|
||||||
|
(map first)))
|
||||||
|
|
||||||
|
(defn- collect-child-ids
|
||||||
|
"Gather child entity IDs for a batch of sales orders. Returns map with
|
||||||
|
keys :orders, :charges, :line-items, :refunds — each a vector of
|
||||||
|
entity IDs eligible for retraction."
|
||||||
|
[db order-ids]
|
||||||
|
(let [order-set (set order-ids)
|
||||||
|
charges (->> (d-api/q '[:find ?c
|
||||||
|
:in $ [?o ...]
|
||||||
|
:where [$ ?o :sales-order/charges ?c]]
|
||||||
|
db order-set)
|
||||||
|
(map second))
|
||||||
|
refunds (->> (d-api/q '[:find ?r
|
||||||
|
:in $ [?o ...]
|
||||||
|
:where [$ ?o :sales-order/refunds ?r]]
|
||||||
|
db order-set)
|
||||||
|
(map second))
|
||||||
|
line-items (->> (d-api/q '[:find ?li
|
||||||
|
:in $ [?c ...]
|
||||||
|
:where [$ ?c :charge/line-items ?li]]
|
||||||
|
db charges)
|
||||||
|
(map second))]
|
||||||
|
{:orders order-ids
|
||||||
|
:charges (vec charges)
|
||||||
|
:line-items (vec line-items)
|
||||||
|
:refunds (vec refunds)}))
|
||||||
|
|
||||||
|
; -- transaction batching
|
||||||
|
|
||||||
|
(defn- batch-transact
|
||||||
|
"Issue [:db/retractEntity ...] transactions in batches of BATCH-SIZE.
|
||||||
|
conn$ is a Datomic connection object.
|
||||||
|
entity-ids should be a seq of Long entity IDs."
|
||||||
|
[conn entity-ids]
|
||||||
|
(let [batches (partition-all BATCH-SIZE entity-ids)
|
||||||
|
_ (doseq [[idx batch] (map-indexed vector batches)]
|
||||||
|
(let [n (count batch)
|
||||||
|
txes (map (fn [eid]
|
||||||
|
[:db/retractEntity eid])
|
||||||
|
batch)]
|
||||||
|
(println " batch" idx ":" n "retracts")
|
||||||
|
(when-not DRY-RUN?
|
||||||
|
@(d-api/transact conn txes))))]
|
||||||
|
:done))
|
||||||
|
|
||||||
|
(defn- retract-all-child-ids!
|
||||||
|
"Retract orders, charges, line-items and refunds from all entity-ID
|
||||||
|
maps produced by collect-child-ids. Logs progress every batch."
|
||||||
|
[conn child-entity-map]
|
||||||
|
(doseq [[type id-seq] child-entity-map]
|
||||||
|
(when (seq id-seq)
|
||||||
|
(println "retracting" type ":" (count id-seq) "ids")
|
||||||
|
(batch-transact conn id-seq))))
|
||||||
|
|
||||||
|
; -- month grouping
|
||||||
|
|
||||||
|
(defn- group-orders-by-month
|
||||||
|
"Group sales order entity IDs by [year month] extracted from
|
||||||
|
:sales-order/day-value. Returns map {{y m} [eid ...]}."
|
||||||
|
[db order-ids]
|
||||||
|
(reduce (fn [acc eid]
|
||||||
|
(when-let [day-val (:sales-order/day-value
|
||||||
|
(d-api/entity db eid))]
|
||||||
|
(let [[y m _] (str/split (str day-val) #"-")
|
||||||
|
k [(Integer/parseInt y)
|
||||||
|
(Integer/parseInt m)]]
|
||||||
|
(update acc k conj eid))))
|
||||||
|
{}
|
||||||
|
order-ids))
|
||||||
|
|
||||||
|
; -- S3 verification (uses amazonica + parquet module)

(def ENTITY-TYPES ["sales-order" "charge"
                   "line-item" "sales-refund"])

(defn- s3-keys-for-date
  "Build S3 parquet keys for all entity types on a given date."
  [date-str]
  (mapv #(pq/parquet-key % date-str) ENTITY-TYPES))

(defn- days-in-month
  "Return seq of YYYY-MM-DD strings for all days in [year month]."
  [year month]
  (let [start         (java.time.LocalDate/of year month 1)
        first-of-next (.plusMonths start 1)
        end-day       (.toEpochDay first-of-next)
        start-day     (.toEpochDay start)]
    (for [d (range start-day end-day)]
      (.toString (java.time.LocalDate/ofEpochDay d)))))

(defn- object-exists?
  "Check if an S3 object exists by attempting get-object."
  [key]
  (try
    (s3/get-object {:bucket-name pq/*bucket*
                    :key key})
    true
    (catch com.amazonaws.services.s3.model.AmazonS3Exception _
      false)))

(defn- verify-month-in-s3?
  "Check that every day in [year month] has at least one backing
  Parquet file on S3 across all entity types.
  Returns a map {:ok bool :missing vec-of-dates}."
  [year month]
  (let [dates (days-in-month year month)]
    (loop [[d & more] dates
           missing []]
      (if-not d
        {:ok (empty? missing)
         :missing missing}
        (let [keys   (s3-keys-for-date d)
              found? (some object-exists? keys)]
          (recur more
                 (if found?
                   missing
                   (conj missing d))))))))

; -- public API: delete-by-month

(defn- delete-by-month
  "Retract all sales entities for a specific year+month.
  Returns :ok on success, :skipped if no orders were found."
  [conn client-entid year month]
  (println "=== deleting" year "-" month
           "dry-run? =" DRY-RUN?)
  (let [db         (d-api/db conn)
        all-ids    (query-sales-order-ids db)
        group      (group-orders-by-month db all-ids)
        target-ids (get group [year month] [])]
    (if (empty? target-ids)
      (do (println "  no orders found for" year "-" month)
          :skipped)
      (do
        (let [child-maps (collect-child-ids db target-ids)
              total-ids  (->> child-maps vals
                              (reduce into [])
                              distinct
                              count)]
          (println " " total-ids "total entities to retract")
          (when-not DRY-RUN?
            (retract-all-child-ids! conn child-maps)))
        :ok))))

; -- public API: cleanup-all

(defn cleanup-all
  "Remove ALL sales-order, charge, line-item, sales-refund from
  Datomic. Uses d-api/transact to issue [:db/retractEntity ...] for
  each entity. Iterates over every month found in DB."
  []
  (let [db      (d-api/db conn)
        all-ids (query-sales-order-ids db)
        group   (group-orders-by-month db all-ids)
        months  (sort (keys group))]
    (println "found" (count months) "months of data")
    (doseq [[y m] months]
      (delete-by-month conn nil y m))
    (println "cleanup-all complete")))

; -- public API: safe-cleanup-all

(defn- collect-all-months
  "Return sorted vec of [year month] pairs with sales orders in DB."
  [conn]
  (let [db      (d-api/db conn)
        all-ids (query-sales-order-ids db)
        grouped (group-orders-by-month db all-ids)]
    (sort (keys grouped))))

(defn safe-cleanup-all
  "Same as cleanup-all but verifies S3 data exists first.
  Before deleting a month's entities, checks that parquet files
  exist in the auto-ap.storage.parquet bucket under prefix 'sales-details'."
  []
  (let [months (collect-all-months conn)]
    (println "=== safe-cleanup-all"
             "months:" (count months)
             "dry-run? =" DRY-RUN?)
    ;; collect-all-months yields [year month] pairs
    (doseq [[y m] months]
      (when-not DRY-RUN?
        (let [result  (verify-month-in-s3? y m)
              missing (:missing result)]
          (cond
            (:ok result)
            (do (println "verified" y "-" m "S3 OK, deleting...")
                (delete-by-month conn nil y m))

            (seq missing)
            (do (println "ERROR" y "-" m "missing in S3:"
                         (str/join ", " missing))
                (throw
                 (ex-info
                  "Missing S3 data; aborting!"
                  {:year y :month m
                   :missing missing})))

            :else
            (println "SKIPPING" y "-" m "no parquet files")))))
    (println "safe-cleanup-all complete")))

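Before any month is retracted, `verify-month-in-s3?` walks every day of the month via `days-in-month` and collects dates with no backing Parquet object for any entity type. The same pass can be sketched in Python as a minimal stdlib-only sketch; `object_exists` and the key layout are illustrative stand-ins for the real `pq/parquet-key` and S3 call, not part of the codebase:

```python
from datetime import date, timedelta

ENTITY_TYPES = ["sales-order", "charge", "line-item", "sales-refund"]

def days_in_month(year, month):
    """All YYYY-MM-DD strings in a month, mirroring days-in-month."""
    d = date(year, month, 1)
    out = []
    while d.month == month:
        out.append(d.isoformat())
        d += timedelta(days=1)
    return out

def verify_month(year, month, object_exists):
    """Return {'ok': bool, 'missing': [dates]} like verify-month-in-s3?.

    object_exists is a hypothetical predicate over an S3 key."""
    missing = []
    for day in days_in_month(year, month):
        # assumed key layout; the real one comes from pq/parquet-key
        keys = [f"sales-details/{et}/{day}.parquet" for et in ENTITY_TYPES]
        if not any(object_exists(k) for k in keys):
            missing.append(day)
    return {"ok": not missing, "missing": missing}
```

A month only passes when every single day has at least one Parquet object for some entity type; one missing day aborts the whole cleanup.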
239  src/clj/auto_ap/migration/sales_to_parquet.clj  Normal file
@@ -0,0 +1,239 @@
(ns auto-ap.migration.sales-to-parquet
  "Migrate historical sales data from Datomic to Parquet + S3.

  Groups records by business date and writes daily partitions.
  Dead-letter records (missing dates) are written separately.

  Usage:
    (migrate-all)                                  ; full migration earliest → latest
    (write-day-by-day \"2024-01-01\" \"2024-03-31\") ; date range
    (write-dead-letter [flat])                     ; write orphaned records"
  (:require [auto-ap.datomic :refer [conn]]
            [auto-ap.storage.parquet :as p]
            [clojure.string :as str]
            [datomic.api :as dc]))

(defn- fetch-all-sales-order-ids
  "Query Datomic for all sales-order external-ids (as entity IDs).
  Returns a vector of entity ids."
  []
  (->> (dc/q '[:find ?e
               :where [?e :sales-order/external-id _]]
             (dc/db conn))
       (map first)
       vec))

(def ^:private sales-order-read
  '[:sales-order/external-id
    :sales-order/date
    {:sales-order/client [:client/code :client/name]}
    :sales-order/location
    {:sales-order/vendor [:vendor/name]}
    :sales-order/total
    :sales-order/tax
    :sales-order/tip
    :sales-order/discount
    :sales-order/service-charge
    :sales-order/source
    :sales-order/reference-link
    {:sales-order/charges
     [:charge/external-id
      :charge/type-name
      :charge/total
      :charge/tax
      :charge/tip
      :charge/date
      {:charge/processor [:db/ident]}
      :charge/returns
      {:charge/client [:client/code]}]}
    {:sales-order/line-items
     [:order-line-item/item-name
      :order-line-item/category
      :order-line-item/total
      :order-line-item/tax
      :order-line-item/discount
      :order-line-item/unit-price
      :order-line-item/quantity
      :order-line-item/note]}])

(defn- pull-sales-order-data
  "Batch pull full sales-order entities plus nested children."
  [eids]
  (if (empty? eids)
    []
    (dc/pull-many (dc/db conn)
                  sales-order-read
                  eids)))

(defn- flatten-order-to-pieces!
  "Flatten a pulled sales-order into :entity-type tagged maps.
  Conj'es each row onto the flat volatile, which is mutated in place."
  [order date-str flat]
  (let [so-ext-id       (:sales-order/external-id order)
        so-date         date-str
        client-code     (get-in order [:sales-order/client :client/code])
        vendor-name     (get-in order [:sales-order/vendor :vendor/name])
        charges         (:sales-order/charges order)
        items           (:sales-order/line-items order)
        payment-methods (->> charges (map :charge/type-name) distinct (str/join ","))
        processors      (->> charges
                             (map #(get-in % [:charge/processor :db/ident]))
                             (remove nil?) distinct (map name) (str/join ","))
        categories      (->> items
                             (map :order-line-item/category)
                             (remove nil?) distinct (str/join ","))]
    (vswap! flat conj
            {:entity-type "sales-order"
             :external-id (str so-ext-id)
             :client-code client-code
             :location (:sales-order/location order)
             :vendor vendor-name
             :total (:sales-order/total order)
             :tax (:sales-order/tax order)
             :tip (:sales-order/tip order)
             :discount (:sales-order/discount order)
             :service-charge (:sales-order/service-charge order)
             :date so-date
             :source (:sales-order/source order)
             :reference-link (:sales-order/reference-link order)
             :payment-methods payment-methods
             :processors processors
             :categories categories})
    (doseq [chg charges]
      (vswap! flat conj
              {:entity-type "charge"
               :external-id (str (get chg :charge/external-id))
               :type-name (get chg :charge/type-name)
               :total (get chg :charge/total)
               :tax (get chg :charge/tax)
               :tip (get chg :charge/tip)
               :date so-date
               :processor (get-in chg [:charge/processor :db/ident])
               :sales-order-external-id (str so-ext-id)})
      (doseq [rt (:charge/returns chg)]
        (vswap! flat conj
                {:entity-type "sales-refund"
                 :type-name (get rt :type-name)
                 :total (get rt :total)
                 :sales-order-external-id (str so-ext-id)})))
    (doseq [li items]
      (vswap! flat conj
              {:entity-type "line-item"
               :item-name (get li :order-line-item/item-name)
               :category (get li :order-line-item/category)
               :total (get li :order-line-item/total)
               :tax (get li :order-line-item/tax)
               :discount (get li :order-line-item/discount)
               :sales-order-external-id (str so-ext-id)}))))

(defn -fetch-order-ids-for-date
  "Query Datomic for all sales-order eids on a given business date."
  [db date-str]
  (let [zone  (java.time.ZoneId/of "America/Los_Angeles")
        ld    (java.time.LocalDate/parse date-str)
        start (-> ld (.atStartOfDay zone) .toInstant java.util.Date/from)
        end   (-> ld (.plusDays 1) (.atStartOfDay zone) .toInstant java.util.Date/from)]
    (->> (dc/q '[:find ?e
                 :in $ ?start ?end
                 :where [?e :sales-order/date ?d]
                 [(>= ?d ?start)]
                 [(< ?d ?end)]]
               db start end)
         (map first)
         vec)))

(defn- date-seq
  "Seq of YYYY-MM-DD strings between start and end inclusive."
  [start end]
  (let [sd   (java.time.LocalDate/parse start)
        ed   (java.time.LocalDate/parse end)
        days (int (Math/abs (- (.toEpochDay sd)
                               (.toEpochDay ed))))]
    (for [i (range 0 (inc days))]
      (.toString (.plusDays sd i)))))

(defn write-day-by-day
  "Migrate a date range one day at a time: fetch, pull, flatten and
  buffer each day's orders, then flush every entity type to Parquet.
  opts may contain :date-set (restrict to those dates) and :batch-size."
  ([start-date end-date]
   (write-day-by-day start-date end-date {}))
  ([start-date end-date opts]
   (let [all-dates  (set (or (opts :date-set) []))
         date-range (if (empty? all-dates)
                      (date-seq start-date end-date)
                      (filter all-dates
                              (date-seq start-date end-date)))
         batch-size (or (opts :batch-size) 100)]
     (doseq [^String day date-range]
       (println "[migration] processing" day)
       (let [eids    (-fetch-order-ids-for-date (dc/db conn) day)
             batches (partition-all batch-size eids)]
         (doseq [batch batches]
           (let [orders (pull-sales-order-data batch)
                 flat   (volatile! [])]
             (doseq [o orders]
               (flatten-order-to-pieces! o day flat))
             (doseq [r @flat]
               (p/buffer! (:entity-type r) r)))))
       (doseq [etype ["sales-order" "charge"
                      "line-item" "sales-refund"]]
         (p/flush-to-parquet! etype day))
       (println "[migration]" day "complete"))
     {:status :completed :total-days (count date-range)})))

(defn- write-dead-letter
  "Write records with missing dates to a dead-letter parquet buffer."
  ([flat]
   (write-dead-letter "dead" flat))
  ([prefix flat]
   (let [dead (filter #(nil? (:date %)) flat)]
     (when (seq dead)
       (doseq [r dead]
         (p/buffer!
          (str prefix "-" (:entity-type r))
          r))))))

(defn- flush-all-types
  "Flush all entity-type buffers, tracking how many records drained."
  []
  (let [etypes ["sales-order" "charge"
                "line-item" "sales-refund"]
        today  (.toString (java.time.LocalDate/now))
        start  (p/total-buf-count)]
    (doseq [et etypes]
      (try
        (p/flush-to-parquet! et today)
        (catch Exception e
          (println "[migration/flush]" et "error:" (.getMessage e)))))
    ;; flushing drains the buffers, so flushed = before - after
    {:records-flush (- start (p/total-buf-count))}))

(defn- get-date-range
  "Get the earliest and latest business dates from Datomic."
  []
  (let [dates (->> (dc/q '[:find ?d
                           :where [_ :sales-order/date ?d]]
                         (dc/db conn))
                   (map first)
                   distinct
                   sort)]
    [(when (seq dates) (.toString (first dates)))
     (when (seq dates) (.toString (last dates)))]))

(defn migrate-all
  "Full migration from earliest to latest date: load unflushed,
  fetch / buffer / flush day by day. Write dead-records for
  sales orders with missing dates."
  []
  (println "[migration] starting full migration...")
  (p/load-unflushed!)
  (let [order-ids             (fetch-all-sales-order-ids)
        [start-date end-date] (get-date-range)]
    (if-not (seq order-ids)
      (do
        (println "[migration] no orders found")
        :no-orders)
      (try
        ;; pull & buffer any orders missing a business date
        (doseq [o (pull-sales-order-data order-ids)
                :when (not (:sales-order/date o))]
          (let [flat (volatile! [])]
            (flatten-order-to-pieces! o "unknown" flat)
            (doseq [r @flat]
              (p/buffer! "dead" r))))
        (write-day-by-day start-date end-date {:batch-size 100})
        (flush-all-types)
        (println "[migration] done")
        :ok
        (catch Exception e
          (println "[migration/error]" (.getMessage e))
          e)))))

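`write-day-by-day` walks an inclusive date range and processes each day's orders in fixed-size batches before flushing that day's partitions. The iteration shape can be sketched in Python (stdlib only; `fetch_ids` is a hypothetical stand-in for the Datomic date query, and these names are illustrative, not from the codebase):

```python
from datetime import date, timedelta

def date_seq(start, end):
    """Inclusive YYYY-MM-DD range, like date-seq."""
    sd, ed = date.fromisoformat(start), date.fromisoformat(end)
    days = abs((ed - sd).days)
    return [(sd + timedelta(days=i)).isoformat() for i in range(days + 1)]

def partition_all(n, xs):
    """Chunk xs into lists of at most n items, like clojure.core/partition-all."""
    return [xs[i:i + n] for i in range(0, len(xs), n)]

def migrate_range(start, end, fetch_ids, batch_size=100):
    """Yield (day, batch) pairs the way write-day-by-day iterates.

    fetch_ids(day) stands in for the per-day Datomic eid query."""
    for day in date_seq(start, end):
        for batch in partition_all(batch_size, fetch_ids(day)):
            yield day, batch
```

Batching per day keeps both the Datomic pull and the in-memory flatten buffer bounded regardless of how many orders a single day holds.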
@@ -1,6 +1,6 @@
 (ns auto-ap.routes.ezcater-xls
   (:require
-   [auto-ap.datomic :refer [audit-transact conn]]
+   [auto-ap.datomic :refer [conn]]
    [auto-ap.logging :as alog]
    [clojure.data.json :as json]
    [auto-ap.parse.excel :as excel]
@@ -12,6 +12,7 @@
    [auto-ap.ssr.ui :refer [base-page]]
    [auto-ap.ssr.utils :refer [html-response]]
    [auto-ap.time :as atime]
+   [auto-ap.storage.parquet :as parquet]
    [bidi.bidi :as bidi]
    [clj-time.coerce :as coerce]
    [clojure.java.io :as io]
@@ -54,54 +55,95 @@
         event-date (some-> (excel/xls-date->date event-date)
                            coerce/to-date-time
                            atime/as-local-time
-                           coerce/to-date )]
+                           coerce/to-date)]
-    (cond (and event-date client-id location )
+    (cond (and event-date client-id location)
           [:order #:sales-order
              {:date event-date
               :external-id (str "ezcater/order/" client-id "-" location "-" order-number)
               :client client-id
               :location location
               :reference-link (str order-number)
               :line-items [#:order-line-item
                            {:external-id (str "ezcater/order/" client-id "-" location "-" order-number "-" 0)
                             :item-name "EZCater Catering"
                             :category "EZCater Catering"
                             :discount (fmt-amount (or adjustments 0.0))
                             :tax (fmt-amount tax)
                             :total (fmt-amount (+ food-total
                                                   tax))}]

               :charges [#:charge
                         {:type-name "CARD"
                          :date event-date
                          :client client-id
                          :location location
                          :external-id (str "ezcater/charge/" client-id "-" location "-" order-number "-" 0)
                          :processor :ccp-processor/ezcater
                          :total (fmt-amount (+ food-total
                                                tax
                                                tip))
                          :tip (fmt-amount tip)}]
               :total (fmt-amount (+ food-total
                                     tax
                                     (or adjustments 0.0)))
               :discount (fmt-amount (or adjustments 0.0))
               :service-charge (fmt-amount (+ fee commission))
               :tax (fmt-amount tax)
               :tip (fmt-amount tip)
               :returns 0.0
               :vendor :vendor/ccp-ezcater}]

           caterer-name
           (do
             (alog/warn ::missing-client
                        :order order-number
                        :store-name store-name
                        :caterer-name caterer-name)
             [:missing caterer-name])

           :else
           nil)))
+
+(defn- flatten-order-to-parquet!
+  "Flatten a sales-order into entity-type tagged maps and buffer to parquet."
+  [order]
+  (let [so-ext-id   (:sales-order/external-id order)
+        so-date     (some-> (:sales-order/date order) .toString)
+        client      (:sales-order/client order)
+        client-code (if (map? client) (:client/code client) client)]
+    (parquet/buffer! "sales-order"
+                     {:entity-type "sales-order"
+                      :external-id so-ext-id
+                      :client-code client-code
+                      :location (:sales-order/location order)
+                      :vendor (:sales-order/vendor order)
+                      :total (:sales-order/total order)
+                      :tax (:sales-order/tax order)
+                      :tip (:sales-order/tip order)
+                      :discount (:sales-order/discount order)
+                      :service-charge (:sales-order/service-charge order)
+                      :date so-date})
+    (when-let [charges (:sales-order/charges order)]
+      (doseq [chg charges]
+        (parquet/buffer! "charge"
+                         {:entity-type "charge"
+                          :external-id (:charge/external-id chg)
+                          :type-name (:charge/type-name chg)
+                          :total (:charge/total chg)
+                          :tax (:charge/tax chg)
+                          :tip (:charge/tip chg)
+                          :date so-date
+                          :processor (some-> (:charge/processor chg) name)
+                          :sales-order-external-id so-ext-id})))
+    (when-let [items (:sales-order/line-items order)]
+      (doseq [li items]
+        (parquet/buffer! "line-item"
+                         {:entity-type "line-item"
+                          :item-name (:order-line-item/item-name li)
+                          :category (:order-line-item/category li)
+                          :total (:order-line-item/total li)
+                          :tax (:order-line-item/tax li)
+                          :discount (:order-line-item/discount li)
+                          :sales-order-external-id so-ext-id})))))
+
 (defn stream->sales-orders [s]
   (let [clients (map first (dc/q '[:find (pull ?c [:client/code
@@ -172,9 +214,20 @@
 
         missing-location (->> parse-results
                               (filter (comp #{:missing} first))
-                              (map last))]
-    (audit-transact new-orders identity)
-    (html-response [:div (format "Successfully imported %d orders." (count new-orders))
+                              (map last))
+        buffered-count (loop [orders new-orders
+                              count 0]
+                         (if-let [o (first orders)]
+                           (do
+                             (try
+                               (flatten-order-to-parquet! o)
+                               (catch Exception e
+                                 (alog/error ::buffer-failed
+                                             :exception e
+                                             :order (:sales-order/external-id o))))
+                             (recur (rest orders) (inc count)))
+                           count))]
+    (html-response [:div (format "Successfully imported %d orders." buffered-count)
                     (when (seq missing-location)
                       [:div "Missing the following locations"
                        [:ul.ul
@@ -182,7 +235,7 @@
                         [:li ml])]])]))
       (catch Exception e
         (alog/error ::import-error
                     :error e)
         (html-response [:div (.getMessage e)])))))

 (defn page [{:keys [matched-route request-method] :as request}]
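The `flatten-order-to-parquet!` pattern above turns one nested order map into flat, `entity-type`-tagged rows so each type can land in its own Parquet partition. A minimal Python analogue (illustrative field names, no Parquet dependency; a sketch of the shape, not the production function):

```python
def flatten_order(order):
    """Split a nested order dict into flat rows tagged with entity-type."""
    so_id = order["external-id"]
    rows = [{"entity-type": "sales-order",
             "external-id": so_id,
             "total": order.get("total")}]
    # child rows carry the parent id so they can be joined back later
    for chg in order.get("charges", []):
        rows.append({"entity-type": "charge",
                     "external-id": chg.get("external-id"),
                     "total": chg.get("total"),
                     "sales-order-external-id": so_id})
    for li in order.get("line-items", []):
        rows.append({"entity-type": "line-item",
                     "item-name": li.get("item-name"),
                     "total": li.get("total"),
                     "sales-order-external-id": so_id})
    return rows
```

Denormalizing the parent's external id onto every child row is what lets the DuckDB views join orders, charges, and line items back together without Datomic refs.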
@@ -3,6 +3,7 @@
    [auto-ap.datomic :refer [conn remove-nils]]
    [auto-ap.logging :as log :refer [capture-context->lc with-context-as]]
    [auto-ap.time :as atime]
+   [auto-ap.storage.parquet :as parquet]
    [cemerick.url :as url]
    [clj-http.client :as client]
    [clj-time.coerce :as coerce]
@@ -27,11 +28,9 @@
                  "Authorization" (str "Bearer " (:client/square-auth-token client))
                  "Content-Type" "application/json"}))
-
-
 
 (defn ->square-date [d]
   (f/unparse (f/formatter "YYYY-MM-dd'T'HH:mm:ssZZ") d))
 
 (def manifold-api-stream
   (let [stream (s/stream 100)]
     (->> stream
@@ -42,10 +41,10 @@
     (de/loop [attempt 0]
       (-> (de/chain (de/future-with (ex/execute-pool)
                       #_(log/info ::request-started
                                   :url (:url request)
                                   :attempt attempt
                                   :source "Square 3"
                                   :background-job "Square 3")
                       (try
                         (client/request (assoc request
                                                :socket-timeout 10000
@@ -104,7 +103,6 @@
                 :exception error))
               []))))
-
 
 (def item-cache (atom {}))
 
 (defn fetch-catalog [client i v]
@@ -124,13 +122,11 @@
                 #(do (swap! item-cache assoc i %)
                      %))))
-
-
 
 (defn fetch-catalog-cache [client i version]
   (if (get @item-cache i)
     (de/success-deferred (get @item-cache i))
     (fetch-catalog client i version)))
 
 (defn item->category-name-impl [client item version]
   (capture-context->lc
     (cond (:item_id (:item_variation_data item))
@@ -161,7 +157,6 @@
                      :item item)
         "Uncategorized"))))
-
 
 (defn item-id->category-name [client i version]
   (capture-context->lc
     (-> [client i]
@@ -226,7 +221,6 @@
                 (concat (:orders result) continued-results))))
           (:orders result)))))))
-
 
 (defn search
   ([client location start end]
    (capture-context->lc
@@ -250,11 +244,9 @@
                 (concat (:orders result) continued-results))))
           (:orders result))))))))
-
-
 
 (defn amount->money [amt]
   (* 0.01 (or (:amount amt) 0.0)))
 
 ;; to get totals:
 (comment
   (reduce
@@ -415,7 +407,6 @@
                :client client
                :location location)))))))
-
 
 (defn get-payment [client p]
   (de/chain (manifold-api-call
              {:url (str "https://connect.squareup.com/v2/payments/" p)
@@ -424,7 +415,6 @@
             :body
             :payment))
-
 
 (defn continue-payout-entry-list [c l poi cursor]
   (capture-context->lc lc
    (de/chain
@@ -602,6 +592,57 @@
        (s/buffer 5)
        (s/realize-each)
        (s/reduce conj []))))))
+
+(defn- flatten-order-to-parquet!
+  "Flatten a sales-order into entity-type tagged maps and buffer to parquet."
+  [order]
+  (let [so-ext-id   (:sales-order/external-id order)
+        so-date     (some-> (:sales-order/date order) .toString)
+        client      (:sales-order/client order)
+        client-code (when client (if (map? client)
+                                   (:client/code client)
+                                   client))]
+    (parquet/buffer! "sales-order"
+                     {:entity-type "sales-order"
+                      :external-id so-ext-id
+                      :client-code (or client-code (:db/id client))
+                      :location (:sales-order/location order)
+                      :vendor (:sales-order/vendor order)
+                      :total (:sales-order/total order)
+                      :tax (:sales-order/tax order)
+                      :tip (:sales-order/tip order)
+                      :discount (:sales-order/discount order)
+                      :service-charge (:sales-order/service-charge order)
+                      :date so-date})
+    (when-let [charges (:sales-order/charges order)]
+      (doseq [chg charges]
+        (parquet/buffer! "charge"
+                         {:entity-type "charge"
+                          :external-id (:charge/external-id chg)
+                          :type-name (:charge/type-name chg)
+                          :total (:charge/total chg)
+                          :tax (:charge/tax chg)
+                          :tip (:charge/tip chg)
+                          :date so-date
+                          :processor (some-> (:charge/processor chg) name)
+                          :sales-order-external-id so-ext-id})
+        (when-let [returns (:charge/returns chg)]
+          (doseq [rt returns]
+            (parquet/buffer! "sales-refund"
+                             {:entity-type "sales-refund"
+                              :type-name (:type-name rt)
+                              :total (:total rt)
+                              :sales-order-external-id so-ext-id})))))
+    (when-let [items (:sales-order/line-items order)]
+      (doseq [li items]
+        (parquet/buffer! "line-item"
+                         {:entity-type "line-item"
+                          :item-name (:order-line-item/item-name li)
+                          :category (:order-line-item/category li)
+                          :total (:order-line-item/total li)
+                          :tax (:order-line-item/tax li)
+                          :discount (:order-line-item/discount li)
+                          :sales-order-external-id so-ext-id})))))
+
 (defn upsert
   ([client]
    (apply de/zip
@@ -616,8 +657,13 @@
           (doseq [x (partition-all 100 results)]
             (log/info ::loading-orders
                       :count (count x))
-            @(dc/transact-async conn x))))))))
+            (doseq [order x]
+              (try
+                (flatten-order-to-parquet! order)
+                (catch Exception e
+                  (log/error ::buffer-failed
+                             :exception e
+                             :order (:sales-order/external-id order))))))))))))
 
 (defn upsert-payouts
   ([client]
@@ -667,7 +713,6 @@
 
       (log/info ::done-loading-refunds)))))))
-
 
 (defn get-cash-shift [client id]
   (de/chain (manifold-api-call {:url (str (url/url "https://connect.squareup.com/v2/cash-drawers/shifts" id))
                                 :method :get
@@ -826,8 +871,6 @@
        d1
        d2))
-
-
 
 (defn remove-voided-orders
   ([client]
    (apply de/zip
@@ -862,32 +905,28 @@
           (doseq [x (partition-all 100 results)]
             (log/info ::removing-orders
                       :count (count x))
-            @(dc/transact-async conn x)))))
+            @(dc/transact-async conn x)
             (de/catch (fn [e]
                         (log/warn ::couldnt-remove :error e)
-                        nil) ))))))
+                        nil)))))))))))
 
 #_(comment
   (require 'auto-ap.time-reader)
 
-  @(let [[c [l]] (get-square-client-and-location "DBFS") ]
-    (log/peek :x [ c l])
-    (search c l #clj-time/date-time "2026-03-28" #clj-time/date-time "2026-03-29")
-
-  )
-
-  @(let [[c [l]] (get-square-client-and-location "NGAK") ]
-    (log/peek :x [ c l])
-    (remove-voided-orders c l #clj-time/date-time "2024-04-11" #clj-time/date-time "2024-04-15"))
-  (doseq [c (get-square-clients)]
-    (try
-      @(remove-voided-orders c)
-      (catch Exception e
-        nil)))
-
-  )
+  @(let [[c [l]] (get-square-client-and-location "DBFS")]
+    (log/peek :x [c l])
+    (search c l #clj-time/date-time "2026-03-28" #clj-time/date-time "2026-03-29"))
+
+  @(let [[c [l]] (get-square-client-and-location "NGAK")]
+    (log/peek :x [c l])
+    (remove-voided-orders c l #clj-time/date-time "2024-04-11" #clj-time/date-time "2024-04-15"))
+
+  (doseq [c (get-square-clients)]
+    (try
+      @(remove-voided-orders c)
+      (catch Exception e
+        nil)))
+  )
 
 (defn upsert-all [& clients]
   (capture-context->lc
@@ -956,8 +995,6 @@
|
|||||||
[:clients clients]
|
[:clients clients]
|
||||||
@(apply upsert-all clients)))
|
@(apply upsert-all clients)))
|
||||||
|
|
||||||
|
|
||||||
|
|
||||||
(comment
|
(comment
|
||||||
(defn refunds-raw-cont
|
(defn refunds-raw-cont
|
||||||
([client l cursor so-far]
|
([client l cursor so-far]
|
||||||
@@ -987,7 +1024,6 @@
|
|||||||
(->>
|
(->>
|
||||||
@(let [[c [l]] (get-square-client-and-location "NGGG")]
|
@(let [[c [l]] (get-square-client-and-location "NGGG")]
|
||||||
|
|
||||||
|
|
||||||
(search c l (time/now) (time/plus (time/now) (time/days -1))))
|
(search c l (time/now) (time/plus (time/now) (time/days -1))))
|
||||||
|
|
||||||
(filter (fn [r]
|
(filter (fn [r]
|
||||||
@@ -997,7 +1033,6 @@
|
|||||||
(->>
|
(->>
|
||||||
@(let [[c [l]] (get-square-client-and-location "NGGG")]
|
@(let [[c [l]] (get-square-client-and-location "NGGG")]
|
||||||
|
|
||||||
|
|
||||||
(refunds-raw-cont c l nil []))
|
(refunds-raw-cont c l nil []))
|
||||||
(filter (fn [r]
|
(filter (fn [r]
|
||||||
(str/starts-with? (:created_at r) "2024-03-14")))))
|
(str/starts-with? (:created_at r) "2024-03-14")))))
|
||||||
@@ -1032,12 +1067,7 @@
|
|||||||
[(:client/code c) (atime/unparse-local (clj-time.coerce/to-date-time (:sales-order/date bad-row)) atime/normal-date) (:sales-order/total bad-row) (:sales-order/tax bad-row) (:sales-order/tip bad-row) (:db/id bad-row)])
|
[(:client/code c) (atime/unparse-local (clj-time.coerce/to-date-time (:sales-order/date bad-row)) atime/normal-date) (:sales-order/total bad-row) (:sales-order/tax bad-row) (:sales-order/tip bad-row) (:db/id bad-row)])
|
||||||
:separator \tab)
|
:separator \tab)
|
||||||
|
|
||||||
|
;; =>
|
||||||
|
|
||||||
|
|
||||||
;; =>
|
|
||||||
|
|
||||||
|
|
||||||
|
|
||||||
(require 'auto-ap.time-reader)
|
(require 'auto-ap.time-reader)
|
||||||
|
|
||||||
@@ -1046,26 +1076,15 @@
|
|||||||
(clojure.pprint/pprint (let [[c [l]] (get-square-client-and-location "NGVT")]
|
(clojure.pprint/pprint (let [[c [l]] (get-square-client-and-location "NGVT")]
|
||||||
l
|
l
|
||||||
|
|
||||||
|
|
||||||
(def z @(search c l #clj-time/date-time "2025-02-23T00:00:00-08:00"
|
(def z @(search c l #clj-time/date-time "2025-02-23T00:00:00-08:00"
|
||||||
#clj-time/date-time "2025-02-28T00:00:00-08:00"))
|
#clj-time/date-time "2025-02-28T00:00:00-08:00"))
|
||||||
(take 10 (map #(first (deref (order->sales-order c l %))) z)))
|
(take 10 (map #(first (deref (order->sales-order c l %))) z))))
|
||||||
|
|
||||||
|
|
||||||
)
|
|
||||||
|
|
||||||
|
|
||||||
|
|
||||||
|
|
||||||
(->> z
|
(->> z
|
||||||
(filter (fn [o]
|
(filter (fn [o]
|
||||||
(seq (filter (comp #{"OTHER"} :type) (:tenders o)))))
|
(seq (filter (comp #{"OTHER"} :type) (:tenders o)))))
|
||||||
(filter #(not (:name (:source %))))
|
(filter #(not (:name (:source %))))
|
||||||
(count)
|
(count))
|
||||||
|
|
||||||
)
|
|
||||||
|
|
||||||
|
|
||||||
|
|
||||||
(doseq [[code] (seq (dc/q '[:find ?code
|
(doseq [[code] (seq (dc/q '[:find ?code
|
||||||
:in $
|
:in $
|
||||||
@@ -1075,32 +1094,22 @@
|
|||||||
[?o :sales-order/client ?c]
|
[?o :sales-order/client ?c]
|
||||||
[?c :client/code ?code]]
|
[?c :client/code ?code]]
|
||||||
(dc/db conn)))
|
(dc/db conn)))
|
||||||
:let [[c [l]] (get-square-client-and-location code)
|
:let [[c [l]] (get-square-client-and-location code)]
|
||||||
]
|
|
||||||
order @(search c l #clj-time/date-time "2026-01-01T00:00:00-08:00" (time/now))
|
order @(search c l #clj-time/date-time "2026-01-01T00:00:00-08:00" (time/now))
|
||||||
:when (= "Invoices" (:name (:source order) ))
|
:when (= "Invoices" (:name (:source order)))
|
||||||
:let [[sales-order] @(order->sales-order c l order)]]
|
:let [[sales-order] @(order->sales-order c l order)]]
|
||||||
|
|
||||||
(when (should-import-order? order)
|
(when (should-import-order? order)
|
||||||
(println "DATE IS" (:sales-order/date sales-order))
|
(println "DATE IS" (:sales-order/date sales-order))
|
||||||
(when (some-> (:sales-order/date sales-order) coerce/to-date-time (time/after? #clj-time/date-time "2026-2-16T00:00:00-08:00"))
|
(when (some-> (:sales-order/date sales-order) coerce/to-date-time (time/after? #clj-time/date-time "2026-2-16T00:00:00-08:00"))
|
||||||
(println "WOULD UPDATE" sales-order)
|
(println "WOULD UPDATE" sales-order)
|
||||||
@(dc/transact auto-ap.datomic/conn [sales-order])
|
@(dc/transact auto-ap.datomic/conn [sales-order]))
|
||||||
)
|
#_@(dc/transact)
|
||||||
#_@(dc/transact )
|
(println "DONE")))
|
||||||
(println "DONE"))
|
|
||||||
|
|
||||||
|
|
||||||
)
|
|
||||||
|
|
||||||
#_(filter (comp #{"OTHER"} :type) (mapcat :tenders z))
|
#_(filter (comp #{"OTHER"} :type) (mapcat :tenders z))
|
||||||
|
|
||||||
|
|
||||||
@(let [[c [l]] (get-square-client-and-location "NGRY")]
|
@(let [[c [l]] (get-square-client-and-location "NGRY")]
|
||||||
#_(search c l (clj-time.coerce/from-date #inst "2025-02-28") (clj-time.coerce/from-date #inst "2025-03-01"))
|
#_(search c l (clj-time.coerce/from-date #inst "2025-02-28") (clj-time.coerce/from-date #inst "2025-03-01"))
|
||||||
|
|
||||||
(order->sales-order c l (:order (get-order c l "KdvwntmfMNTKBu8NOocbxatOs18YY" )))
|
(order->sales-order c l (:order (get-order c l "KdvwntmfMNTKBu8NOocbxatOs18YY")))))
|
||||||
|
|
||||||
)
|
|
||||||
|
|
||||||
)
|
|
||||||
|
|||||||
@@ -104,19 +104,18 @@
             :size :small})])
     (com/field {:label "Payment Type"}
                (com/radio-card {:size :small
                                 :name "payment-type"
                                 :value (:payment-type (:query-params request))
                                 :options [{:value ""
                                            :content "All"}
                                           {:value "cash"
                                            :content "Cash"}
                                           {:value "check"
                                            :content "Check"}
                                           {:value "debit"
                                            :content "Debit"}]}))
     (exact-match-id* request)]])

(def default-read '[*
                    [:payment/date :xform clj-time.coerce/from-date]
                    {:invoice-payment/_payment [* {:invoice-payment/invoice [*]}]}
@@ -212,7 +211,6 @@
                 '[(iol-ion.query/dollars= ?transaction-amount ?amount)]]}
     :args [(:amount query-params)]})

  (:status route-params)
  (merge-query {:query {:in ['?status]
                        :where ['[?e :payment/status ?status]]}
@@ -244,12 +242,12 @@

(defn sum-visible-pending [ids]
  (->>
   (dc/q {:find ['?id '?o]
          :in ['$ '[?id ...]]
          :where ['[?id :payment/amount ?o]
                  '[?id :payment/status :payment-status/pending]]}
         (dc/db conn)
         ids)
   (map last)
   (reduce
    +
@@ -257,15 +255,15 @@

(defn sum-client-pending [clients]
  (->>
   (dc/q {:find '[?e ?a]
          :in '[$ [?clients ?start ?end]]
          :where '[[(iol-ion.query/scan-payments $ ?clients ?start ?end) [[?e _ ?sort-default] ...]]
                   [?e :payment/status :payment-status/pending]
                   [?e :payment/amount ?a]]}
         (dc/db conn)
         [clients
          nil
          nil])
   (map last)
   (reduce
@@ -277,16 +275,14 @@
        {ids-to-retrieve :ids matching-count :count
         all-ids :all-ids} (fetch-ids db request)]

    [(->> (hydrate-results ids-to-retrieve db request))
     matching-count
     (sum-visible-pending all-ids)
     (sum-client-pending (extract-client-ids (:clients request)
                                             (:client request)
                                             (:client-id (:query-params request))
                                             (when (:client-code (:query-params request))
                                               [:client/code (:client-code (:query-params request))])))]))

(def query-schema (mc/schema
                   [:maybe [:map {:date-range [:date-range :start-date :end-date]}
@@ -327,7 +323,7 @@
     (assoc-in (exact-match-id* request) [1 :hx-swap-oob] true)])
   :query-schema query-schema
   :action-buttons (fn [request]
                     (let [[_ _ visible-in-float total-in-float] (:page-results request)]
                       [(com/pill {:color :primary} " Visible in float "
                                  (format "$%,.2f" visible-in-float))
                        (com/pill {:color :secondary} " Total in float "
@@ -409,7 +405,7 @@
    :render (fn [{:payment/keys [date]}]
              (some-> date (atime/unparse-local atime/normal-date)))}
   {:key "amount"
    :sort-key "amount"
    :name "Amount"
    :render (fn [{:payment/keys [amount]}]
              (some->> amount (format "$%.2f")))}
@@ -421,10 +417,10 @@
         (map :invoice-payment/invoice)
         (filter identity)
         (map (fn [invoice]
                {:link (hu/url (bidi/path-for ssr-routes/only-routes
                                              ::invoice-route/all-page)
                               {:exact-match-id (:db/id invoice)})
                 :content (str "Inv. " (:invoice/invoice-number invoice))})))
    (some-> p :transaction/_payment ((fn [t]
                                       [{:link (hu/url (bidi/path-for client-routes/routes
                                                                      :transactions)
@@ -434,8 +430,6 @@

(def row* (partial helper/row* grid-page))

(comment
  (mc/decode query-schema {"exact-match-id" "123"} (mt/transformer main-transformer mt/strip-extra-keys-transformer))
  (mc/decode query-schema {} (mt/transformer main-transformer mt/strip-extra-keys-transformer))
@@ -445,7 +439,6 @@
  (mc/decode query-schema {"payment-type" "food"} (mt/transformer main-transformer mt/strip-extra-keys-transformer))
  (mc/decode query-schema {"vendor" "87"} (mt/transformer main-transformer mt/strip-extra-keys-transformer))

  (mc/decode query-schema {"start-date" #inst "2023-12-21T08:00:00.000-00:00"} (mt/transformer main-transformer mt/strip-extra-keys-transformer)))

(defn delete [{check :entity :as request identity :identity}]
@@ -459,7 +452,7 @@
  #(assert-can-see-client identity (:db/id (:payment/client check))))
  (notify-if-locked (:db/id (:payment/client check))
                    (:payment/date check))
  (let [removing-payments (mapcat (fn [x]
                                    (let [invoice (:invoice-payment/invoice x)
                                          new-balance (+ (:invoice/outstanding-balance invoice)
                                                         (:invoice-payment/amount x))]
@@ -578,7 +571,6 @@
        (assoc-in [:query-params :start] 0)
        (assoc-in [:query-params :per-page] 250))))

  :else
  selected)
  updated-count (void-payments-internal ids (:identity request))]
@@ -591,7 +583,7 @@

(defn wrap-status-from-source [handler]
  (fn [{:keys [matched-current-page-route] :as request}]
    (let [request (cond-> request
                    (= ::route/cleared-page matched-current-page-route) (assoc-in [:route-params :status] :payment-status/cleared)
                    (= ::route/pending-page matched-current-page-route) (assoc-in [:route-params :status] :payment-status/pending)
                    (= ::route/voided-page matched-current-page-route) (assoc-in [:route-params :status] :payment-status/voided)
@@ -605,7 +597,7 @@
      ::route/pending-page (-> (helper/page-route grid-page)
                               (wrap-implied-route-param :status :payment-status/pending))
      ::route/voided-page (-> (helper/page-route grid-page)
                              (wrap-implied-route-param :status :payment-status/voided))
      ::route/all-page (-> (helper/page-route grid-page)
                           (wrap-implied-route-param :status nil))

@@ -618,7 +610,6 @@
      ::route/bulk-delete (-> bulk-delete-dialog
                              (wrap-admin))

      ::route/table (helper/table-route grid-page)}
     (fn [h]
       (-> h
@@ -1,7 +1,7 @@
(ns auto-ap.ssr.pos.sales-orders
  (:require
   [auto-ap.datomic
    :refer [add-sorter-fields apply-pagination apply-sort-3 merge-query
            pull-many query2]]
   [auto-ap.datomic.sales-orders :as d-sales]
   [auto-ap.query-params :as query-params :refer [wrap-copy-qp-pqp]]
@@ -17,7 +17,6 @@
   [auto-ap.time :as atime]
   [bidi.bidi :as bidi]
   [clj-time.coerce :as c]
   [malli.core :as mc]))

(def query-schema (mc/schema
@@ -172,11 +171,8 @@
      charges))

(defn fetch-page [request]
  (let [{:keys [rows count]} (d-sales/fetch-page-ssr request)]
    [rows count]))

(def grid-page
@@ -200,13 +196,13 @@
   :title "Sales orders"
   :entity-name "Sales orders"
   :route :pos-sales-table
   :action-buttons (fn [request]
                     (let [{:keys [total tax]} (d-sales/summarize-page-ssr request)]
                       (when (and total tax)
                         [(com/pill {:color :primary}
                                    (format "Total $%.2f" total))
                          (com/pill {:color :secondary}
                                    (format "Tax $%.2f" tax))])))
   :row-buttons (fn [_ e]
                  (when (:sales-order/reference-link e)
                    [(com/a-icon-button {:href (:sales-order/reference-link e)}
373	src/clj/auto_ap/storage/parquet.clj	Normal file
@@ -0,0 +1,373 @@
(ns auto-ap.storage.parquet
  (:require [config.core :refer [env]]
            [amazonica.aws.s3 :as s3]
            [clojure.java.io :as io]
            [clojure.string :as str]
            [clojure.data.json :as json])
  (:import (java.sql DriverManager)
           (java.time LocalDate)))

(def ^:dynamic *bucket* (:data-bucket env))
(def parquet-prefix "sales-details")

(defn s3-location [filename]
  (str "s3://" *bucket* "/" filename))

(defn parquet-key [entity-type date-str]
  (str parquet-prefix "/" entity-type "/" date-str ".parquet"))

(def db (atom nil))

(defn connect! []
  (let [conn (DriverManager/getConnection "jdbc:duckdb:")
        stmt (.createStatement conn)]
    (.execute stmt "INSTALL httpfs; LOAD httpfs;")
    (when-let [key (:aws-access-key-id env)]
      (.execute stmt (str "SET s3_access_key_id='" key "'"))
      (.execute stmt (str "SET s3_secret_access_key='" (:aws-secret-access-key env) "'"))
      (.execute stmt (str "SET s3_region='" (or (:aws-region env) "us-east-1") "'")))
    (.close stmt)
    ;; Close the cached connection on JVM shutdown.
    (.addShutdownHook (Runtime/getRuntime)
                      (Thread. #(when-let [c @db] (.close c))))
    (reset! db conn)))

(defn disconnect! []
  (locking db
    (when-let [c @db]
      (.close c)
      (reset! db nil))))

(defmacro with-duckdb
  [& body]
  `(let [conn# (or @db (connect!))]
     (try
       (let [~'conn conn#]
         ~@body)
       (finally
         (when (and (not @db) conn#)
           (.close conn#))))))

(defn execute! [sql]
  (with-duckdb
    (let [stmt (.createStatement conn)]
      (.execute stmt sql)
      nil)))

(defn query-scalar [sql]
  (with-duckdb
    (let [stmt (.createStatement conn)
          rs (.executeQuery stmt sql)]
      (when (.next rs)
        (.getObject rs 1)))))

(defn query-rows [sql]
  (with-duckdb
    (let [stmt (.createStatement conn)
          rs (.executeQuery stmt sql)
          meta (.getMetaData rs)
          col-count (.getColumnCount meta)
          cols (vec (for [i (range 1 (inc col-count))]
                      (keyword (.getColumnLabel meta i))))]
      (loop [rows []]
        (if (.next rs)
          (recur (conj rows
                       (zipmap cols
                               (vec (for [i (range 1 (inc col-count))]
                                      (.getObject rs i))))))
          rows)))))

(defn execute-to-parquet! [sql ^String parquet-path]
  (with-duckdb
    (let [stmt (.createStatement conn)]
      (.execute stmt
                (format "COPY (%s) TO '%s' (FORMAT PARQUET, OVERWRITE_OR_IGNORE)"
                        sql parquet-path))
      (io/file parquet-path))))

(defn upload-parquet! [local-parquet-file s3-key]
  (s3/put-object {:bucket-name *bucket*
                  :key s3-key
                  :file local-parquet-file})
  (s3-location s3-key))
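For reference, `execute-to-parquet!` wraps the caller's SELECT in a DuckDB `COPY` statement. A sketch of the generated SQL, with illustrative file paths (the real paths come from the flush job, not from this example):

```sql
-- What execute-to-parquet! emits, paths illustrative:
COPY (SELECT * FROM read_json_auto('/tmp/sales-order-2026-04-24.jsonl'))
TO '/tmp/sales-order-2026-04-24.parquet' (FORMAT PARQUET, OVERWRITE_OR_IGNORE);
```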
(defonce *buffers* (atom {}))

(defn- wal-dir []
  (io/file (System/getProperty "user.dir" "/tmp")
           "parquet-wal"))

(defn- init-wal! []
  (let [dir (wal-dir)]
    (when-not (.exists dir)
      (.mkdirs dir))))

(defn buffer! [entity-type record]
  (init-wal!)
  (let [seq-no (System/currentTimeMillis)
        entry (assoc record :_seq-no seq-no)]
    (swap! *buffers* update entity-type (fnil conj []) entry)
    (try
      (let [wal-file (io/file (wal-dir)
                              (str entity-type ".jsonl"))]
        (io/make-parents wal-file)
        (with-open [w (io/writer wal-file :append true)]
          (.write w ^String (json/write-str {:seq-no seq-no
                                             :record record}))
          (.write w (int \newline))))
      (catch Exception e
        (println "[parquet/wal]" (.getMessage e))))
    entry))

(defn clear-buffer! [entity-type]
  (swap! *buffers* dissoc entity-type))

(defn buffer-count [entity-type]
  (-> @*buffers* (get entity-type []) count))

(defn total-buf-count []
  (->> @*buffers*
       vals (mapcat identity) count))
(defn flush-to-parquet!
  "Flush buffered records for entity-type to parquet + S3."
  [entity-type date-str]
  (let [records (get @*buffers* entity-type [])]
    (if (empty? records)
      {:status :no-records}
      (let [date-str (or date-str (.toString (LocalDate/now)))
            jsonl-file (io/file "/tmp"
                                (str entity-type "-" date-str ".jsonl"))
            parquet-file (io/file "/tmp"
                                  (str entity-type "-" date-str ".parquet"))
            s3-key (parquet-key entity-type date-str)]
        (try
          (with-open [w (io/writer jsonl-file :append true)]
            (doseq [r records]
              (.write w ^String (json/write-str (dissoc r :_seq-no)))
              (.write w (int \newline))))
          (execute-to-parquet!
           (format "SELECT * FROM read_json_auto('%s')"
                   (.getAbsolutePath jsonl-file))
           (.getAbsolutePath parquet-file))
          (upload-parquet! parquet-file s3-key)
          (clear-buffer! entity-type)
          (.delete ^java.io.File jsonl-file)
          (.delete ^java.io.File parquet-file)
          {:key s3-key :status :ok}
          (catch Exception e
            (throw (ex-info "Flush failed"
                            {:entity-type entity-type
                             :error (.getMessage e)}
                            e))))))))
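Once a day file is uploaded, it can be read back over `httpfs` straight from S3. A hedged example with an illustrative bucket name (the real bucket comes from `(:data-bucket env)`):

```sql
-- Read a flushed day file back from S3; bucket name illustrative.
SELECT COUNT(*)
FROM read_parquet('s3://my-data-bucket/sales-details/sales-order/2026-04-24.parquet');
```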
(defn flush-by-date!
  "Flush all entity types for today."
  []
  (let [etypes ["sales-order" "charge"
                "line-item" "sales-refund"]
        today (.toString (LocalDate/now))
        flushed (into #{}
                      (keep (fn [et]
                              (let [{:keys [status]}
                                    (flush-to-parquet! et today)]
                                (when (= status :ok)
                                  et))))
                      etypes)]
    {:flushed flushed}))

(defn load-unflushed!
  "Restore unflushed records from WAL jsonl files into *buffers*."
  []
  (init-wal!)
  (let [etypes ["sales-order" "charge"
                "line-item" "sales-refund"]
        loaded (reduce-kv
                (fn [acc et data]
                  (if-not (empty? data)
                    (assoc acc et
                           (->> (str/split-lines data)
                                (keep #(try
                                         ;; :key-fn keyword so the :record / :seq-no
                                         ;; lookups below actually match.
                                         (let [entry (json/read-str % :key-fn keyword)]
                                           (when entry
                                             (assoc (:record entry) :_seq-no (:seq-no entry))))
                                         (catch Exception _)))))
                    acc))
                {}
                (into {}
                      (keep (fn [et]
                              (let [f (io/file
                                       (wal-dir)
                                       (str et ".jsonl"))]
                                (when (.exists f)
                                  [et (slurp f)]))))
                      etypes))]
    (swap! *buffers* merge loaded)))

(defn get-unflushed-count []
  (total-buf-count))

(defn unflushed-records? []
  (not= 0 (total-buf-count)))
;;; DuckDB Read Layer

(defn date-seq
  "Seq of YYYY-MM-DD strings between start and end inclusive."
  [start end]
  (let [sd (LocalDate/parse start)
        ed (LocalDate/parse end)
        days (int (Math/abs
                   (- (.toEpochDay sd)
                      (.toEpochDay ed))))]
    (for [i (range 0 (inc days))]
      (.toString (.plusDays sd i)))))

(defn today []
  (.toString (LocalDate/now)))

(defn- parquet-glob
  "Build a glob pattern or explicit file list for the date range.
  Uses glob patterns for ranges > 60 days; explicit list otherwise."
  [entity-type start-date end-date]
  (let [days (-> (LocalDate/parse end-date)
                 (.toEpochDay)
                 (- (.toEpochDay (LocalDate/parse start-date)))
                 inc)]
    (if (> days 60)
      (let [prefix (format "s3://%s/sales-details/%s/" *bucket* entity-type)
            sy (-> (LocalDate/parse start-date) .getYear)
            ey (-> (LocalDate/parse end-date) .getYear)]
        (if (= sy ey)
          [(format "%s%d-*.parquet" prefix sy)]
          (vec
           (for [y (range sy (inc ey))]
             (format "%s%d-*.parquet" prefix y)))))
      (vec
       (map (fn [d]
              (format "'s3://%s/sales-details/%s/%s.parquet'"
                      *bucket* entity-type d))
            (date-seq start-date end-date))))))
(defn parquet-query
  "Build SQL to read all parquet files in date range.
  Returns map with :sql and :count-sql keys."
  [entity-type start-date end-date]
  (let [globs (parquet-glob entity-type start-date end-date)
        use-glob? (some #(.endsWith ^String % "*.parquet") globs)
        base (if use-glob?
               (format "SELECT * FROM read_parquet(%s, union_by_name=true)"
                       (if (= (count globs) 1)
                         (format "'%s'" (first globs))
                         (format "[%s]"
                                 (str/join ", " (map #(format "'%s'" %) globs)))))
               (format "SELECT * FROM read_parquet([%s])"
                       (str/join ", " globs)))
        add-date-filter (fn [sql]
                          (if (> (-> (LocalDate/parse end-date)
                                     (.toEpochDay)
                                     (- (.toEpochDay (LocalDate/parse start-date)))
                                     inc)
                                 60)
                            (format "%s WHERE date >= '%s' AND date <= '%s'"
                                    sql start-date end-date)
                            sql))
        sql (add-date-filter base)]
    {:sql sql
     :count-sql (format "SELECT COUNT(*) FROM (%s) t" sql)}))
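For illustration, the two shapes of SQL that `parquet-query` can emit (bucket name illustrative): an explicit file list for short ranges, and a year glob plus a date predicate for ranges over 60 days:

```sql
-- Short range: explicit per-day file list.
SELECT * FROM read_parquet(['s3://my-data-bucket/sales-details/sales-order/2026-04-23.parquet',
                            's3://my-data-bucket/sales-details/sales-order/2026-04-24.parquet']);

-- Long range (> 60 days): year glob, schemas unioned, plus a date filter.
SELECT * FROM read_parquet('s3://my-data-bucket/sales-details/sales-order/2026-*.parquet',
                           union_by_name=true)
WHERE date >= '2026-01-01' AND date <= '2026-04-24';
```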
(defn- like-clause [col v]
  (str "\"" col "\" LIKE '%" v "%'"))

(defn- build-sales-orders-where [opts]
  (let [eq-clauses (keep
                    (fn [[key col]]
                      (let [v (get opts key)]
                        (when v
                          (str "\"" col "\" = '" v "'"))))
                    [[:client "client-code"]
                     [:vendor "vendor"]
                     [:location "location"]])
        like-clauses (keep
                      (fn [[key col]]
                        (let [v (get opts key)]
                          (when v
                            (like-clause col v))))
                      [[:payment-method "payment-methods"]
                       [:processor "processors"]
                       [:category "categories"]])
        range-clauses (keep
                       (fn [[key col op]]
                         (let [v (get opts key)]
                           (when v
                             (str "\"" col "\" " op " " v))))
                       [[:total-gte "total" ">="]
                        [:total-lte "total" "<="]])
        all-clauses (concat eq-clauses like-clauses range-clauses)]
    (when (seq all-clauses)
      (str " WHERE " (str/join " AND " all-clauses)))))
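As a sketch, the fragment `build-sales-orders-where` would produce for a hypothetical opts map like `{:client "NGVT" :payment-method "cash" :total-gte 100}` (values illustrative):

```sql
 WHERE "client-code" = 'NGVT' AND "payment-methods" LIKE '%cash%' AND "total" >= 100
```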
(defn get-sales-orders
  ([start-date end-date]
   (get-sales-orders start-date end-date {}))
  ([start-date end-date opts]
   (try
     (let [q (parquet-query "sales-order" start-date end-date)
           base-sql (:sql q)
           count-sql (:count-sql q)
           sort-col (get opts :sort "date")
           order (get opts :order "DESC")
           limit (get opts :limit)
           offset (get opts :offset)
           where-str (build-sales-orders-where opts)
           full-sql (if where-str
                      (str base-sql where-str)
                      base-sql)
           result (cond-> full-sql
                    sort-col (str " ORDER BY " sort-col " " (name order))
                    limit (str " LIMIT " limit)
                    offset (str " OFFSET " offset))
           full-count (if where-str
                        (str count-sql where-str)
                        count-sql)]
       {:rows (query-rows result)
        ;; query-scalar may return nil; default before coercing to int
        :count (int (or (query-scalar full-count) 0))})
     (catch Exception _
       {:rows [] :count 0}))))
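`get-sales-orders` layers optional pieces onto the base SELECT in a fixed order: WHERE, then ORDER BY, then LIMIT/OFFSET, each appended only when the corresponding option is present. A minimal Python sketch of that assembly (function and parameter names are illustrative):

```python
# Hypothetical mirror of the cond-> SQL assembly in get-sales-orders:
# optional clauses are appended in WHERE -> ORDER BY -> LIMIT -> OFFSET order.
def paged_sql(base, where=None, sort="date", order="DESC", limit=None, offset=None):
    sql = base + (where or "")
    if sort:
        sql += f" ORDER BY {sort} {order}"
    if limit is not None:
        sql += f" LIMIT {limit}"
    if offset is not None:
        sql += f" OFFSET {offset}"
    return sql

print(paged_sql("SELECT * FROM t", where=" WHERE x = 1", limit=25, offset=475))
```

The count query reuses only the WHERE suffix, never the pagination clauses, so the reported total covers the whole filtered set rather than one page.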
(defn get-sales-orders-summary
  ([start-date end-date]
   (get-sales-orders-summary start-date end-date {}))
  ([start-date end-date opts]
   (try
     (let [q (parquet-query "sales-order" start-date end-date)
           base-sql (:sql q)
           where-str (build-sales-orders-where opts)
           full-sql (if where-str
                      (str base-sql where-str)
                      base-sql)
           sum-sql (format "SELECT COALESCE(SUM(total), 0) AS total, COALESCE(SUM(tax), 0) AS tax FROM (%s) t" full-sql)
           row (first (query-rows sum-sql))]
       {:total (or (:total row) 0.0)
        :tax (or (:tax row) 0.0)})
     (catch Exception _
       {:total 0.0 :tax 0.0}))))
(defn query-deduped
  "Query records deduplicated by external-id (latest _seq_no wins)."
  [entity-type start-date end-date]
  (let [q (parquet-query entity-type start-date end-date)]
    (query-rows
     (str (:sql q)
          " QUALIFY ROW_NUMBER() OVER"
          " (PARTITION BY external_id"
          " ORDER BY _seq_no DESC) = 1"))))
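The QUALIFY clause above implements the "void" handling required for an immutable file format: a void or correction is appended as a new row with the same `external_id` and a higher `_seq_no`, and reads keep only the newest row per id. A minimal Python sketch of the same rule (data and names are illustrative):

```python
# Hypothetical sketch of "latest _seq_no wins" dedup over append-only records:
# for each external_id, keep only the row with the highest _seq_no.
def dedupe(rows):
    latest = {}
    for row in rows:
        key = row["external_id"]
        if key not in latest or row["_seq_no"] > latest[key]["_seq_no"]:
            latest[key] = row
    return list(latest.values())

rows = [
    {"external_id": "ord-1", "_seq_no": 1, "status": "active"},
    {"external_id": "ord-1", "_seq_no": 2, "status": "voided"},  # later void supersedes
    {"external_id": "ord-2", "_seq_no": 1, "status": "active"},
]
print(dedupe(rows))
```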
(defn query-by-entity-id [entity-type external-id start-date end-date]
  (->> (query-deduped entity-type start-date end-date)
       (filter #(= (:external_id %) (name external-id)))
       first))
(defn count-records-in-parquet [entity-type start-date end-date]
  (let [q (parquet-query entity-type start-date end-date)]
    ;; query-scalar may return nil; default before coercing to int
    (int (or (query-scalar (:count-sql q)) 0))))
184  src/clj/auto_ap/storage/sales_summaries.clj  Normal file
@@ -0,0 +1,184 @@
(ns auto-ap.storage.sales-summaries
  "Aggregation functions querying Parquet files on S3 via DuckDB.
   Entity types: sales-order | charge | line-item | sales-refund
   S3 pattern: s3://<bucket>/sales-details/<entity-type>/<YYYY-MM-DD>.parquet"
  (:require [auto-ap.storage.parquet :as p]
            [clojure.string :as str]))
(defn- dq
  "Wrap a column name in double quotes for DuckDB."
  [col]
  (str "\"" col "\""))

(defn- sum-dbl [val]
  (try
    (if val (double val) 0.0)
    (catch Exception _e
      0.0)))
(defn- pq-files
  "Vector of single-quoted S3 parquet file paths for the date range."
  [entity-type start-date end-date]
  (let [dates (p/date-seq start-date end-date)]
    (vec (map #(str "'s3://" p/*bucket*
                    "/sales-details/" entity-type "/"
                    ;; closing quote so the paths splice into read_parquet([...])
                    % ".parquet'")
              dates))))
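Because the writer emits one Parquet file per entity type per day, a date-range query expands to a list of single-quoted S3 paths that are spliced into DuckDB's `read_parquet([...])`. A minimal Python sketch of that expansion (bucket name and function names are illustrative):

```python
# Hypothetical mirror of pq-files: expand a date range into one quoted
# S3 path per day, ready to join into read_parquet([...]).
from datetime import date, timedelta

def pq_files(bucket, entity_type, start, end):
    paths, d = [], start
    while d <= end:
        paths.append(f"'s3://{bucket}/sales-details/{entity_type}/{d.isoformat()}.parquet'")
        d += timedelta(days=1)
    return paths

files = pq_files("my-bucket", "charge", date(2024, 4, 1), date(2024, 4, 3))
print(f"SELECT * FROM read_parquet([{', '.join(files)}])")
```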
(defn sum-payments-by-type
  "Return {processor-key -> {type-name-string -> total-double}}."
  [client-id start-date end-date]
  (let [files (pq-files "charge" start-date end-date)]
    (try
      (let [sql (str "SELECT "
                     (dq "processor") " AS proc, "
                     (dq "type-name") " AS type_name, "
                     "SUM(" (dq "total") ")::DOUBLE AS total_amount "
                     "FROM read_parquet([" (str/join ", " files) "]) "
                     "WHERE " (dq "client-code") " = '" client-id "' "
                     "GROUP BY " (dq "processor") ", " (dq "type-name"))
            rows (p/query-rows sql)]
        (reduce (fn [acc row]
                  (let [proc (:proc row)
                        tname (str/trim (name (:type_name row)))
                        total (sum-dbl (:total_amount row))]
                    (update-in acc [proc tname] (fnil + 0.0) total)))
                {}
                rows))
      (catch Exception e
        (println "[sales-summaries]" (.getMessage e))
        {}))))
(defn sum-discounts [client-id start-date end-date]
  (let [files (pq-files "sales-order" start-date end-date)]
    (try
      (let [sql (str "SELECT SUM(" (dq "discount") ")::DOUBLE AS discount_total "
                     "FROM read_parquet([" (str/join ", " files) "]) "
                     "WHERE " (dq "client-code") " = '" client-id "'")]
        (or (some-> (first (p/query-rows sql)) :discount_total sum-dbl) 0.0))
      (catch Exception e
        (println "[sales-summaries/discounts]" (.getMessage e))
        0.0))))
(defn sum-refunds-by-type [client-id start-date end-date]
  (let [files (pq-files "sales-refund" start-date end-date)]
    (try
      (let [sql (str "SELECT "
                     (dq "type-name") " AS type_name, "
                     "SUM(" (dq "total") ")::DOUBLE AS total_amount "
                     "FROM read_parquet([" (str/join ", " files) "]) "
                     ;; refunds carry no client-code; scope them via their parent order
                     "WHERE " (dq "sales-order-external-id")
                     " IN (SELECT " (dq "external-id")
                     " FROM read_parquet(["
                     (str/join ", " (pq-files "sales-order" start-date end-date))
                     "]) WHERE " (dq "client-code") " = '" client-id "') "
                     "GROUP BY " (dq "type-name"))
            rows (p/query-rows sql)]
        (reduce (fn [acc row]
                  (let [tname (str/trim (name (:type_name row)))
                        total (sum-dbl (:total_amount row))]
                    (assoc acc tname (+ (get acc tname 0.0) total))))
                {}
                rows))
      (catch Exception e
        (println "[sales-summaries/refunds]" (.getMessage e))
        {}))))
(defn sum-taxes [client-id start-date end-date]
  (let [files (pq-files "sales-order" start-date end-date)]
    (try
      (let [sql (str "SELECT SUM(" (dq "tax") ")::DOUBLE AS tax_total "
                     "FROM read_parquet([" (str/join ", " files) "]) "
                     "WHERE " (dq "client-code") " = '" client-id "'")]
        (or (some-> (first (p/query-rows sql)) :tax_total sum-dbl) 0.0))
      (catch Exception e
        (println "[sales-summaries/tax]" (.getMessage e))
        0.0))))
(defn sum-tips [client-id start-date end-date]
  (let [files (pq-files "sales-order" start-date end-date)]
    (try
      (let [sql (str "SELECT SUM(" (dq "tip") ")::DOUBLE AS tip_total "
                     "FROM read_parquet([" (str/join ", " files) "]) "
                     "WHERE " (dq "client-code") " = '" client-id "'")]
        (or (some-> (first (p/query-rows sql)) :tip_total sum-dbl) 0.0))
      (catch Exception e
        (println "[sales-summaries/tip]" (.getMessage e))
        0.0))))
(defn sum-sales-by-category [client-id start-date end-date]
  (let [files (pq-files "line-item" start-date end-date)]
    (try
      (let [sql (str "SELECT "
                     (dq "category") " AS category, "
                     "SUM(" (dq "total") ")::DOUBLE AS total_amount, "
                     "SUM(" (dq "tax") ")::DOUBLE AS tax_amount, "
                     "SUM(" (dq "discount") ")::DOUBLE AS discount_amount "
                     "FROM read_parquet([" (str/join ", " files) "]) "
                     ;; line items carry no client-code; scope them via their parent order
                     "WHERE " (dq "sales-order-external-id")
                     " IN (SELECT " (dq "external-id")
                     " FROM read_parquet(["
                     (str/join ", " (pq-files "sales-order" start-date end-date))
                     "]) WHERE " (dq "client-code") " = '" client-id "') "
                     "GROUP BY " (dq "category"))
            rows (p/query-rows sql)]
        (mapv (fn [row]
                {:category (or (:category row) "Unknown")
                 :total (sum-dbl (:total_amount row))
                 :tax (sum-dbl (:tax_amount row))
                 :discount (sum-dbl (:discount_amount row))})
              rows))
      (catch Exception e
        (println "[sales-summaries/sales]" (.getMessage e))
        []))))
30  test/clj/auto_ap/storage/parquet_test.clj  Normal file
@@ -0,0 +1,30 @@
(ns auto-ap.storage.parquet-test
  (:require [auto-ap.storage.parquet :as p]
            [clojure.test :refer [deftest is testing]]))

(deftest test-query-scalar
  (testing "SELECT 1 returns 1"
    (is (= 1 (p/query-scalar "SELECT 1")))))

(deftest test-query-scalar-with-expression
  (testing "SELECT 2 + 2 returns 4"
    (is (= 4 (p/query-scalar "SELECT 2 + 2")))))

(deftest test-buffer
  (testing "buffer! adds record to buffer"
    (p/clear-buffer! "test-type")
    (p/buffer! "test-type" {:id 1 :name "test"})
    (is (= 1 (p/buffer-count "test-type")))))

(deftest test-clear-buffer
  (testing "clear-buffer! empties buffer"
    (p/clear-buffer! "test-type")
    (p/buffer! "test-type" {:id 2})
    (is (= 1 (p/buffer-count "test-type")))
    (p/clear-buffer! "test-type")
    (is (= 0 (p/buffer-count "test-type")))))

(deftest test-date-seq
  (testing "date-seq generates correct sequence"
    (let [result (p/date-seq "2024-04-01" "2024-04-03")]
      (is (= ["2024-04-01" "2024-04-02" "2024-04-03"] result)))))
112  test/clj/auto_ap/storage/perf_test.clj  Normal file
@@ -0,0 +1,112 @@
(ns auto-ap.storage.perf-test
  (:require [auto-ap.storage.parquet :as p]
            [clojure.java.io :as io]))

(defn timestamp []
  (System/currentTimeMillis))

(defn timed [label sql-fn]
  (let [start (timestamp)
        result (sql-fn)
        elapsed (- (timestamp) start)]
    (println (format "%s: %d ms" label elapsed))
    result))

(defn run-perf-tests []
  (p/connect!)
  (try
    (let [bucket "data.dev.app.integreatconsult.com"
          prefix "test-duckdb"
          local-parquet "/tmp/test_data.parquet"
          s3-key (str prefix "/data.parquet")]

      ;; Create 100k test rows
      (println "\n=== Creating 100k test rows ===")
      (p/execute! "DROP TABLE IF EXISTS test_data")
      (p/execute! (str "
        CREATE TABLE test_data AS
        SELECT
          i AS id,
          'order_' || i AS external_id,
          CASE (i % 5)
            WHEN 0 THEN 'north'
            WHEN 1 THEN 'south'
            WHEN 2 THEN 'east'
            WHEN 3 THEN 'west'
            ELSE 'central'
          END AS region,
          CASE (i % 8)
            WHEN 0 THEN 'food'
            WHEN 1 THEN 'beverage'
            WHEN 2 THEN 'alcohol'
            WHEN 3 THEN 'catering'
            WHEN 4 THEN 'retail'
            WHEN 5 THEN 'dessert'
            WHEN 6 THEN 'merch'
            ELSE 'other'
          END AS category,
          -- DuckDB's RANDOM() returns a DOUBLE in [0, 1), so scale it
          -- rather than using the SQLite-style RANDOM() % N idiom
          ROUND(1 + RANDOM() * 100, 2) AS amount,
          CAST(DATE '2024-01-01' + (i % 365) * INTERVAL '1 day' AS DATE) AS sale_date,
          CASE WHEN i % 20 = 0 THEN 'voided' ELSE 'active' END AS status
        FROM generate_series(1, 100000) AS t(i)"))
      (println "Row count:" (p/query-scalar "SELECT COUNT(*) FROM test_data"))
      (println "Voided count:" (p/query-scalar "SELECT COUNT(*) FROM test_data WHERE status = 'voided'"))
      (println "Amount > 3 count:" (p/query-scalar "SELECT COUNT(*) FROM test_data WHERE amount > 3"))

      ;; Write to local parquet
      (println "\n=== Writing local parquet ===")
      (timed "Write parquet" #(p/execute-to-parquet! "SELECT * FROM test_data" local-parquet))
      (let [f (io/file local-parquet)]
        (println "File size:" (format "%.1f MB" (/ (.length f) 1048576.0))))

      ;; Upload to S3
      (println "\n=== Uploading to S3 ===")
      (timed "S3 upload" #(p/upload-parquet! (io/file local-parquet) prefix))
      (println "S3 URI:" (p/s3-location s3-key))

      ;; Now test reading from S3
      (println "\n=== Performance Tests (reading from S3) ===")
      (let [s3-uri (str "s3://" bucket "/" s3-key)]

        ;; Register the S3 parquet file as a view in DuckDB
        (p/execute! (format "CREATE VIEW s3_test AS SELECT * FROM read_parquet('%s')" s3-uri))
        (println "Total rows in S3:" (p/query-scalar "SELECT COUNT(*) FROM s3_test"))

        ;; Test 1: Page 1 - first 25 rows
        (println "\n--- Test 1: Page 1 (LIMIT 25 OFFSET 0) ---")
        (timed "First page (25 rows)" #(p/query-rows "SELECT * FROM s3_test ORDER BY id LIMIT 25"))
        (println "Sample row:" (first (p/query-rows "SELECT * FROM s3_test ORDER BY id LIMIT 1")))

        ;; Test 2: Page 20 - rows 476-500 (OFFSET 475)
        (println "\n--- Test 2: Page 20 (LIMIT 25 OFFSET 475) ---")
        (timed "Page 20 (25 rows)" #(p/query-rows "SELECT * FROM s3_test ORDER BY id LIMIT 25 OFFSET 475"))

        ;; Test 3: Filter amount > 3 (no pagination)
        (println "\n--- Test 3: Filter amount > 3 (no limit) ---")
        (timed "Filter amount > 3 (all)" #(do (p/query-scalar "SELECT COUNT(*) FROM s3_test WHERE amount > 3") :done))

        ;; Test 4: Filter + pagination
        (println "\n--- Test 4: Filter amount > 3 + LIMIT 25 ---")
        (timed "Filter + paginated (25 rows)" #(p/query-rows "SELECT * FROM s3_test WHERE amount > 3 ORDER BY id LIMIT 25"))

        ;; Test 5: Filter + page 20
        (println "\n--- Test 5: Filter amount > 3 + LIMIT 25 OFFSET 475 ---")
        (timed "Filter + page 20" #(p/query-rows "SELECT * FROM s3_test WHERE amount > 3 ORDER BY id LIMIT 25 OFFSET 475"))

        ;; Test 6: Aggregation on S3 data
        (println "\n--- Test 6: Aggregation (SUM, AVG on amount) ---")
        (timed "Aggregation SUM/AVG" #(p/query-scalar "SELECT SUM(amount), AVG(amount) FROM s3_test WHERE status = 'active'"))

        ;; Cleanup
        (p/execute! "DROP VIEW IF EXISTS s3_test")
        (p/execute! "DROP TABLE IF EXISTS test_data")))

    (finally
      (p/disconnect!))))

(run-perf-tests)
(println "\n=== Done ===")