mirror of
https://github.com/penpot/penpot.git
synced 2026-05-09 01:58:46 +00:00
* 🐛 Fix Plugin API token application for JS array of strings

  Plugin code calling `shape.applyToken(token, ["fill"])` or
  `token.applyToShapes([rect], ["fill"])` from JavaScript supplies a JS
  array of strings. The plugin proxies expected a Clojure set of keywords,
  and two coupled defects made the calls silently no-op (or, with
  `throwValidationErrors` enabled, throw "check error"):

  1. `token-attr-plugin->token-attr` only consulted its alias map when the
     input was already a keyword. String inputs like "fill" fell through to
     the identity branch, so the downstream `cto/token-attr?` predicate
     (which checks against a set of keywords) returned false for every
     string. Coerce strings to keywords first.
  2. The `applyToken` / `applyToShapes` / `applyToSelected` schemas used
     plain `[:set ...]`, which has no `:decode/json` transformer for JS
     array → Clojure set coercion. Switch to the registered `[::sm/set ...]`
     (in `app.common.schema`), which provides the array → set decoder.
     After the switch, the standard JSON pipeline converts `["fill"]` to
     `#{"fill"}`, then the inner `[:and ::sm/keyword [:fn token-attr?]]`
     decodes each element to a keyword and validates it.

  Also extends the docstring on `token-attr-plugin->token-attr` to make the
  string-friendly contract explicit, and registers a new `tokens-test` ns
  under `frontend/test/frontend_tests/plugins/` with six `deftest` blocks
  covering:

  - known keywords passing through unchanged
  - keyword aliases (`:r1` → `:border-radius-top-left`, etc.)
  - string inputs coerced to keywords (regression for #9162)
  - `token-attr?` accepting both keyword and string inputs
  - `token-attr?` rejecting unknown attrs and nil

  Closes #9162

* 🐛 Fix wrong direction in plugin-name alias tests

  The added tests in tokens_test.cljs and the new docstring in tokens.cljs
  described the alias resolution in the wrong direction. The map is
  {:r1 :border-radius-top-left, …}, then map-invert'd, so
  token-attr-plugin->token-attr maps verbose plugin-side names
  (:border-radius-top-left) to canonical internal short names (:r1), not
  the other way around. Inputs already in canonical form (:r1, :fill,
  "fill", …) pass through unchanged. Flipped the alias-resolution test
  expectations and the keyword/string-input cases, and refreshed the
  docstring and the regression-coverage comment to match.

---------

Co-authored-by: Andrey Antukh <niwi@niwi.nz>
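The alias-resolution and string-coercion contract described in these commit messages can be sketched as a standalone helper. This is a minimal illustration only: the alias map below is a hypothetical two-entry subset, and while the function name mirrors the real `token-attr-plugin->token-attr`, this is not Penpot's implementation.

```clojure
(ns token-attr-sketch
  (:require [clojure.set :as set]))

;; Hypothetical subset of the internal short-name -> verbose plugin-name
;; map; the real map in app.plugins.tokens has many more entries.
(def ^:private attr->plugin-attr
  {:r1 :border-radius-top-left
   :r2 :border-radius-top-right})

;; map-invert'd at load time: verbose plugin-side name -> canonical
;; internal short name, matching the direction the second commit fixes.
(def ^:private plugin-attr->attr
  (set/map-invert attr->plugin-attr))

(defn token-attr-plugin->token-attr
  "Coerces a plugin-supplied attr (string or keyword) to its canonical
  internal keyword. Strings are keywordized first (the #9162 fix), then
  verbose aliases resolve to short names; anything already canonical
  passes through unchanged."
  [attr]
  (let [kw (if (string? attr) (keyword attr) attr)]
    (get plugin-attr->attr kw kw)))

;; (token-attr-plugin->token-attr "border-radius-top-left") => :r1
;; (token-attr-plugin->token-attr :fill)                    => :fill
```

The key detail is that coercion happens before the alias lookup, so the string and keyword code paths converge on a single map access.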
83 lines
4.2 KiB
Clojure
;; This Source Code Form is subject to the terms of the Mozilla Public
;; License, v. 2.0. If a copy of the MPL was not distributed with this
;; file, You can obtain one at http://mozilla.org/MPL/2.0/.
;;
;; Copyright (c) KALEIDOS INC

(ns frontend-tests.plugins.tokens-test
  (:require
   [app.plugins.tokens :as ptok]
   [cljs.test :as t :include-macros true]))

;; Regression coverage for issue #9162.
;;
;; Plugin code calling `shape.applyToken(token, ["fill"])` or
;; `token.applyToShapes([rect], ["fill"])` from JavaScript supplies a JS
;; array of strings. Penpot's plugin proxies expect a Clojure set of
;; keywords. Two coupled defects made these calls silently no-op (or, with
;; `throwValidationErrors` enabled, throw a "check error"):
;;
;; 1. `token-attr-plugin->token-attr` only consulted its alias map when
;;    the input was already a keyword — string inputs like "fill" or
;;    "border-radius-top-left" fell through to the identity branch
;;    unchanged, so the downstream `cto/token-attr?` predicate (which
;;    checks against a set of keywords) returned false.
;; 2. The `applyToken` / `applyToShapes` / `applyToSelected` schemas used
;;    plain `[:set ...]`, which does not have a `:decode/json` transformer
;;    for the JS array → Clojure set coercion. Penpot's custom
;;    `[::sm/set ...]` does. Switching to the registered set type lets the
;;    standard JSON decoder pipeline turn the JS argument into a set of
;;    strings, after which the `[:and ::sm/keyword [:fn token-attr?]]`
;;    element schema coerces each string to a keyword and validates it.
;;
;; These helper-level tests pin the string-friendly conversion contract;
;; the schema-level fix is covered by the existing plugin integration
;; suite that exercises `applyToken` end-to-end.

(t/deftest token-attr-plugin->token-attr-passes-canonical-form-through
  ;; Both already-canonical short names and unaliased names pass through
  ;; unchanged.
  (t/is (= :fill (ptok/token-attr-plugin->token-attr :fill)))
  (t/is (= :stroke-color (ptok/token-attr-plugin->token-attr :stroke-color)))
  (t/is (= :r1 (ptok/token-attr-plugin->token-attr :r1)))
  (t/is (= :p2 (ptok/token-attr-plugin->token-attr :p2))))

(t/deftest token-attr-plugin->token-attr-resolves-verbose-plugin-aliases
  ;; Plugin-side verbose names (e.g. `:border-radius-top-left`) map to
  ;; their canonical short internal form (`:r1`) so plugin authors can
  ;; spell the corner explicitly without the engine having to know both.
  (t/is (= :r1 (ptok/token-attr-plugin->token-attr :border-radius-top-left)))
  (t/is (= :r2 (ptok/token-attr-plugin->token-attr :border-radius-top-right)))
  (t/is (= :r3 (ptok/token-attr-plugin->token-attr :border-radius-bottom-right)))
  (t/is (= :r4 (ptok/token-attr-plugin->token-attr :border-radius-bottom-left)))
  (t/is (= :p1 (ptok/token-attr-plugin->token-attr :padding-top-left)))
  (t/is (= :m3 (ptok/token-attr-plugin->token-attr :margin-bottom-right))))

(t/deftest token-attr-plugin->token-attr-coerces-string-input
  ;; This is the actual regression — JS plugin calls supply strings.
  (t/is (= :fill (ptok/token-attr-plugin->token-attr "fill")))
  (t/is (= :stroke-color (ptok/token-attr-plugin->token-attr "stroke-color")))
  ;; Verbose plugin aliases work via the string path too.
  (t/is (= :r1 (ptok/token-attr-plugin->token-attr "border-radius-top-left")))
  (t/is (= :m3 (ptok/token-attr-plugin->token-attr "margin-bottom-right"))))

(t/deftest token-attr?-accepts-keyword-input
  (t/is (true? (boolean (ptok/token-attr? :fill))))
  (t/is (true? (boolean (ptok/token-attr? :stroke-color))))
  (t/is (true? (boolean (ptok/token-attr? :r1))))
  (t/is (true? (boolean (ptok/token-attr? :p2)))))

(t/deftest token-attr?-accepts-string-input
  ;; Same JS-array-of-strings reproducer as the issue, exercised at the
  ;; predicate layer the plugin schemas call into.
  (t/is (true? (boolean (ptok/token-attr? "fill"))))
  (t/is (true? (boolean (ptok/token-attr? "stroke-color"))))
  (t/is (true? (boolean (ptok/token-attr? "r1"))))
  (t/is (true? (boolean (ptok/token-attr? "m3")))))

(t/deftest token-attr?-rejects-unknown-input
  (t/is (false? (boolean (ptok/token-attr? :not-a-real-attr))))
  (t/is (false? (boolean (ptok/token-attr? "not-a-real-attr"))))
  (t/is (false? (boolean (ptok/token-attr? nil)))))
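The schema-level half of the fix (the `[::sm/set ...]` JSON decode described in the file's header comment) ultimately amounts to the following conversion at the JS boundary. The sketch below is a hypothetical stand-in written in plain ClojureScript, not Penpot's actual malli-based decoder pipeline; `decode-token-attrs` is an invented name for illustration.

```clojure
;; Minimal sketch of what the registered ::sm/set + ::sm/keyword decode
;; pipeline achieves for a JS argument such as ["fill", "stroke-color"]:
;; a sequential JS array becomes a Clojure set, and each string element
;; becomes a keyword.
(defn decode-token-attrs
  [js-attrs]
  (into #{} (map keyword) (js->clj js-attrs)))

;; At a ClojureScript REPL:
;; (decode-token-attrs #js ["fill" "stroke-color"])
;; => #{:fill :stroke-color}
```

Validation (the `[:fn token-attr?]` step) would then run over the resulting keyword set, which is why the predicate-level tests above are a faithful proxy for the schema behavior.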
|