feat(store): new OpenAPI-based client store #5529
base: main
Conversation
RelativeCI: #9780 Bundle Size: 2.58 MiB (+5.31%), 332a7fc (current) vs ab0d3a3 main #9775 (baseline). The bundle introduced 2 duplicate packages. Bundle analysis report for branch feat/store, generated by RelativeCI.
Cloudflare Preview for the API Client
new store whooooooooooooooooooooooooo
Didn't go too deep, but glanced over and left some comments. Pretty sweet! We should iron a couple of things out:
- we should have tests for full reactivity
- for live sync, handle the whole spec + some conflict resolution (already works apparently 🥳)
- figure out how we want to do additional data like auth
- figure out if we want to support overlays
- explore how it would look with client lite + openapi blocks
- let's do IndexedDB from the beginning instead of localStorage, so we don't hit the same limitations (see the sketch after this list)
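For the IndexedDB point, here is a rough sketch of what a persistence plugin could look like, mirroring the localStoragePlugin shown later in this thread. The plugin interface (hydrate/persist hooks), the indexedDbPlugin name, and the use of idb-keyval are all assumptions, not part of this PR:

```ts
import { get, set } from 'idb-keyval'

// Hypothetical plugin shape: hydrate once on startup, persist on every change.
// IndexedDB avoids the ~5 MB localStorage quota, so large OpenAPI documents fit.
export const indexedDbPlugin = ({ key = 'state' }: { key?: string } = {}) => ({
  hydrate: async () => (await get(key)) ?? undefined,
  persist: async (state: unknown) => {
    // Store a plain serializable snapshot (reactive proxies can't be structured-cloned)
    await set(key, JSON.parse(JSON.stringify(state)))
  },
})
```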
// Both paths access the same data
const person = collection.document.components.schemas.Person
const user = collection.document.components.schemas.User
Super cool
// Changes through either path update the source
user.properties.age = { type: 'number' }
Okay, so we probably don't want to do this. Make it immutable and use mutators instead.
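For reference, one possible shape of that mutator approach. Nothing below exists in this PR; the names are made up to illustrate keeping the document immutable and funnelling writes through an explicit API:

```ts
// Hypothetical mutator-style API instead of direct assignment:
collection.mutate((document) => {
  document.components.schemas.User.properties.age = { type: 'number' }
})

// …or a path-based variant:
collection.update('components.schemas.User.properties.age', { type: 'number' })
```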
  },
})

// Vue reactivity works through references
love it!
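A small illustration of that point (assuming a `collection` instance like the ones in the snippets above is in scope; the rest is plain Vue):

```ts
import { computed, watch } from 'vue'

// `collection` is assumed to come from the store, as in the snippets above.
const personSchema = computed(() => collection.document.components.schemas.Person)

// Because the store is reactive, watch() picks up changes made through any reference.
watch(
  () => personSchema.value?.properties,
  (properties) => console.log('Person schema changed:', properties),
  { deep: true },
)
```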
})

// Prefix temporary properties with _ and they won’t be exported.
collection.document.components.schemas.Person._selected = true
We probably don't want to allow just any private properties. For the auth stuff for sure, OR we just use an overlay to store that kind of info. Open to discussion on this one.
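If the overlay route wins, the client-only state could live in a standard OpenAPI Overlay instead of underscore-prefixed properties. A sketch: the x-scalar-selected extension name is made up, while the structure follows the Overlay 1.0 spec and matches the applyOverlay code further down:

```ts
// Client-only state kept out of the document itself:
const clientStateOverlay = {
  overlay: '1.0.0',
  info: { title: 'Client-only state', version: '1.0.0' },
  actions: [
    {
      target: "$.components.schemas['Person']",
      // hypothetical vendor extension instead of Person._selected = true
      update: { 'x-scalar-selected': true },
    },
  ],
}
```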
localStoragePlugin(),

// Or specify a custom storage key (default: 'state')
localStoragePlugin({ key: 'my-state' })
Yeah, we should do this regardless; our own apps might start conflicting too.
if (isRef(content)) {
  workspace.state.collections[collectionId] = createCollection(content)
} else {
  Object.assign(workspace.state.collections[collectionId], content)
Check warning (Code scanning / CodeQL): Prototype-polluting assignment (Medium), from library input.
Copilot Autofix (AI), 11 days ago:
To fix the issue, we need to prevent prototype pollution by ensuring that collectionId cannot be a dangerous value like __proto__, constructor, or prototype. This can be achieved by validating collectionId before using it as a key in workspace.state.collections. If collectionId matches any of these reserved property names, we should throw an error or handle it appropriately.

The best approach is to add a validation step at the beginning of the load method to reject unsafe keys. This ensures that the rest of the method operates on safe input.
@@ -45,2 +45,7 @@
 ) {
+  // Validate collectionId to prevent prototype pollution
+  if (['__proto__', 'constructor', 'prototype'].includes(collectionId)) {
+    throw new Error(`Invalid collectionId: "${collectionId}" is not allowed.`);
+  }
+
   const content =
@@ -48,4 +53,2 @@

   // TODO: Validate content

   if (typeof workspace.state.collections[collectionId] === 'undefined') {
    }
  }
} else if (action.update && typeof target === 'object' && target !== null) {
  Object.assign(target, action.update)
Check warning (Code scanning / CodeQL): Prototype-polluting assignment (Medium), from library input.
Copilot Autofix (AI), 2 days ago:
To fix the issue, we need to ensure that the Object.assign operation on line 451 does not allow prototype-polluting keys like __proto__, constructor, or prototype to be added to the target object. This can be achieved by validating and filtering the keys in action.update before applying them to target. Specifically, we can use a utility function to sanitize the action.update object by removing any prototype-polluting keys.

The fix involves:
- Adding a utility function sanitizeObject to filter out dangerous keys (__proto__, constructor, prototype) from an object.
- Using this utility function to sanitize action.update before calling Object.assign.
@@ -414,2 +414,12 @@
 */
+function sanitizeObject(obj: UnknownObject): UnknownObject {
+  const dangerousKeys = ['__proto__', 'constructor', 'prototype'];
+  return Object.keys(obj).reduce((sanitized, key) => {
+    if (!dangerousKeys.includes(key)) {
+      sanitized[key] = obj[key];
+    }
+    return sanitized;
+  }, {} as UnknownObject);
+}
+
 function applyOverlay(document: UnknownObject, overlay: UnknownObject) {
@@ -450,3 +460,4 @@
 } else if (action.update && typeof target === 'object' && target !== null) {
-  Object.assign(target, action.update)
+  const sanitizedUpdate = sanitizeObject(action.update);
+  Object.assign(target, sanitizedUpdate);
 }
  target[key] = { ...sourceValue }
} else {
  // For primitive values, just assign
  target[key] = sourceValue
Check warning (Code scanning / CodeQL): Prototype-polluting function (Medium), flowing from source into target.
Copilot Autofix (AI), 2 days ago:
To fix the issue, we will add safeguards to the mergeDocuments function to prevent prototype pollution. Specifically:
- Block the special keys __proto__ and constructor from being merged into the target object.
- Ensure that only own properties of the source object are processed during the merge.

This fix will involve:
- Adding a check to skip keys that are __proto__ or constructor.
- Using Object.prototype.hasOwnProperty.call(source, key) to ensure that only own properties of source are considered.
@@ -592,2 +592,12 @@
 for (const key in source) {
+  // Skip prototype-polluting keys
+  if (key === "__proto__" || key === "constructor") {
+    continue
+  }
+
+  // Ensure the key is an own property of the source object
+  if (!Object.prototype.hasOwnProperty.call(source, key)) {
+    continue
+  }
+
   const sourceValue = source[key]
  target[key] = { ...sourceValue }
} else {
  // For primitive values, just assign
  target[key] = sourceValue
Check warning (Code scanning / CodeQL): Prototype-polluting function (Medium), flowing from source into target.
Copilot Autofix (AI), 2 days ago:
To fix the issue, we will:
- Block the special keys __proto__ and constructor from being copied from source to target. This prevents prototype pollution.
- Ensure that only own properties of the source object are copied to the target object. This avoids copying inherited properties that could lead to unexpected behavior.

The changes will be made in the mergeDocuments function, specifically in the loop that iterates over the keys of the source object.
@@ -592,2 +592,7 @@
 for (const key in source) {
+  // Skip inherited properties and prototype-polluting keys
+  if (!Object.prototype.hasOwnProperty.call(source, key) || key === "__proto__" || key === "constructor") {
+    continue
+  }
+
   const sourceValue = source[key]
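Since the same class of fix shows up in load, applyOverlay, and mergeDocuments, one option (not part of the suggested diffs above, purely a sketch) would be a single shared guard:

```ts
// Hypothetical shared helper consolidating the three suggestions above.
const UNSAFE_KEYS = new Set(['__proto__', 'constructor', 'prototype'])

export const isSafeKey = (key: string): boolean => !UNSAFE_KEYS.has(key)

export const safeEntries = (obj: Record<string, unknown>): [string, unknown][] =>
  Object.entries(obj).filter(([key]) => isSafeKey(key))

// e.g. instead of Object.assign(target, action.update):
// for (const [key, value] of safeEntries(action.update)) { target[key] = value }
```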
Problem
We’re using a reactive data store (mostly in @scalar/api-client), which helped us to learn a lot. We hit a few limitations, though:

Solution
With this PR we’re introducing a new, fully OpenAPI-based reactive data store. :) Here are the main ideas:
- resolves $ref’s on the fly (not all at once)

Usage
A whole workspace:
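A hedged sketch, reconstructed from API names that appear in the review snippets and the task list below (createWorkspace, localStoragePlugin, workspace.create/update/delete). The import path, the plugins option, and the exact signatures are assumptions, and some of these calls are still listed as tasks:

```ts
import { createWorkspace, localStoragePlugin } from '@scalar/store' // path is a guess

const workspace = createWorkspace({
  // persist with a custom storage key so apps don't clash (default: 'state')
  plugins: [localStoragePlugin({ key: 'my-state' })],
})

// One collection per OpenAPI document. `data` can be a string, object, ref,
// function or (async) function. Assuming create() returns the collection:
const myCollection = workspace.create('myCollection', {
  openapi: '3.1.0',
  info: { title: 'Example API', version: '1.0.0' },
  paths: {},
})

workspace.update(myCollection, { info: { title: 'Renamed API' } })
workspace.delete(myCollection)
```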
A workspace has a collection for every document. And this is how a single collection works:
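A sketch assembled from the review snippets above; workspace.state.collections comes from the diff, the rest mirrors the snippets:

```ts
const collection = workspace.state.collections.myCollection

// $ref'd and direct paths resolve to the same reactive object
const person = collection.document.components.schemas.Person
const user = collection.document.components.schemas.User

// Changes through either path update the source document
user.properties.age = { type: 'number' }

// Properties prefixed with _ are treated as temporary and won't be exported
person._selected = true
```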
Performance
Tasks/Ideas
Not all are necessary to get the store in. We need to identify the critical ones.
- @scalar/openapi-parser dependency? It’s just there for upgrading.
- @scalar/openapi-types dependency? It’s just there for the Zod schema, which is too strict anyway.
- Ref’s
- workspace.create('myCollection', data) (data can be a string, object, ref, function, (async) function)
- workspace.delete(myCollection)
- workspace.update(myCollection, newValue)
- helper/parse.ts in parallel
- computed() on top of the data store: to retrieve all tags, with a default tag, with tag groups, tagsSorter … (see the sketch after this list)
- optional() and catch() everywhere?
- createWorkspace({ readOnly: true })
- no any, no @ts-expect-error, no type-casting
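As a sketch of the computed() idea from the list above (the getter itself is not in the PR, only collection.document is; types are intentionally loose):

```ts
import { computed } from 'vue'

// Gather all tags used across operations, with a 'default' fallback, sorted.
const allTags = computed(() => {
  const tags = new Set<string>(['default'])

  for (const pathItem of Object.values(collection.document.paths ?? {})) {
    for (const operation of Object.values(pathItem ?? {})) {
      for (const tag of operation?.tags ?? []) {
        tags.add(tag)
      }
    }
  }

  return [...tags].sort()
})
```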
From the feedback:

Not sure, but I think we said we have (= it’s required) to:
Super early ideas:
- createPartialCollection<Operation>({ summary: 'This is the same reactive collection, but just for a specific OpenAPI object' }) (helps with the transition? might be cool for Scalar Blocks?)

Checklist
I’ve gone through the following:
- If needed, I’ve added a changeset (pnpm changeset).