Spark names #1532
Conversation
Actionable comments posted: 0
♻️ Duplicate comments (3)
src/rpc/blockchain.cpp (3)
Lines 244-249: ⚠️ Potential issue: Validate spark name existence before retrieving data.
The function doesn't check if the provided spark name exists before trying to get its data, which could lead to undefined behavior.
```diff
  std::string sparkName = request.params[0].get_str();
  CSparkNameManager *sparkNameManager = CSparkNameManager::GetInstance();
+ // Check if the spark name exists
+ if (!sparkNameManager->SparkNameExists(sparkName)) {
+     throw JSONRPCError(RPC_INVALID_PARAMETER, "Spark name does not exist");
+ }
  std::string SparkAddr;
  sparkNameManager->GetSparkAddress(sparkName, SparkAddr);
```
Lines 250-252: ⚠️ Potential issue: Remove the unused network variable.
The `network` variable is defined but never used in the function.
```diff
  UniValue result(UniValue::VOBJ);
- unsigned char network = spark::GetNetworkType();
```
Lines 247-248: ⚠️ Potential issue: Add error handling for GetSparkAddress failure.
The function should check the return value of `GetSparkAddress` and handle the case where it fails to retrieve an address.
```diff
  std::string SparkAddr;
- sparkNameManager->GetSparkAddress(sparkName, SparkAddr);
+ bool addressFound = sparkNameManager->GetSparkAddress(sparkName, SparkAddr);
+ if (!addressFound) {
+     throw JSONRPCError(RPC_INTERNAL_ERROR, "Failed to retrieve address for spark name");
+ }
```
🧹 Nitpick comments (2)
src/rpc/blockchain.cpp (2)
Lines 219-236: The parameter description in the help text needs clarification.
The help text doesn't follow the standard format for argument descriptions. The parameter description should include the parameter number, name, type, and whether it's required.
```diff
  "\nArguments:\n"
- "Spark name (string)\n"
+ "1. \"sparkname\" (string, required) The spark name to query\n"
```
Lines 253-260: Consider handling possible nulls in additional data.
The function should handle the possibility of null or missing additional data for the spark name.
```diff
  result.push_back(Pair("address", SparkAddr));
  uint64_t nameBlockHeight = sparkNameManager->GetSparkNameBlockHeight(sparkName);
  result.push_back(Pair("validUntil", nameBlockHeight));
  std::string sparkNameData = sparkNameManager->GetSparkNameAdditionalData(sparkName);
+ // Handle case where additional data might be empty
+ if (sparkNameData.empty()) {
+     sparkNameData = "";
+ }
  result.push_back(Pair("additionalInfo", sparkNameData));
```
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (1)
- src/rpc/blockchain.cpp (3 hunks)
⏰ Context from checks skipped due to timeout of 90000ms (1)
- GitHub Check: Create Source Distribution
🔇 Additional comments (3)
src/rpc/blockchain.cpp (3)
Line 28: Include path looks good.
This include provides access to the new `CSparkNameManager` class needed for Spark name operations.
Lines 180-216: RPC command implementation looks solid.
The `getsparknames` function correctly:
- Validates parameters
- Checks if Spark functionality is activated
- Retrieves and formats Spark names with their addresses
Good job on the thread safety with `LOCK(cs_main)` and proper error handling when Spark isn't activated.
Lines 1746-1747: RPC command registration looks good.
The new RPC commands are properly registered in the command table.
Actionable comments posted: 0
🧹 Nitpick comments (1)
src/qt/addresstablemodel.cpp (1)
Lines 38-43: Consider uncommenting or removing the AddressType enum.
The enum is currently commented out but is still being modified. If it's being used elsewhere, it should be uncommented. If it's deprecated, consider removing it entirely to avoid confusion.
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (6)
- src/qt/addressbookpage.cpp (9 hunks)
- src/qt/addressbookpage.h (2 hunks)
- src/qt/addresstablemodel.cpp (12 hunks)
- src/rpc/blockchain.cpp (1 hunks)
- src/rpc/client.cpp (1 hunks)
- src/wallet/rpcwallet.cpp (3 hunks)
✅ Files skipped from review due to trivial changes (1)
- src/rpc/blockchain.cpp
🚧 Files skipped from review as they are similar to previous changes (3)
- src/qt/addressbookpage.h
- src/rpc/client.cpp
- src/qt/addressbookpage.cpp
🧰 Additional context used
🧬 Code Definitions (2)
src/qt/addresstablemodel.cpp (1)
src/qt/addresstablemodel.h (4): address (82-82), address (87-87), address (112-112), pubCoin (113-113)
src/wallet/rpcwallet.cpp (5)
src/wallet/wallet.h (1): sparkNameData (1105-1109)
src/rpc/server.h (4): HelpExampleRpc (202-202), name (156-156), name (157-157), name (179-179)
src/rpc/server.cpp (4): HelpExampleRpc (592-596), HelpExampleRpc (592-592), name (367-373), name (367-367)
src/spark/state.h (3): IsSparkAllowed (45-45), IsSparkAllowed (46-46), GetNetworkType (47-47)
src/sparkname.h (10): name (110-110), name (119-119), name (156-156), name (157-157), name (161-161), name (163-163), sparkName (170-170), GetNetworkType (12-12), sparkNameData (147-147), sparkNameData (150-150)
⏰ Context from checks skipped due to timeout of 90000ms (1)
- GitHub Check: Create Source Distribution
🔇 Additional comments (17)
src/wallet/rpcwallet.cpp (5)
Lines 3812-3823: Enhancement to `spendspark` to support spark names.
The modified code now checks if the name starts with '@' to interpret it as a spark name rather than a direct address. If so, it calls `GetSparkAddress` from the SparkNameManager to retrieve the corresponding spark address. This allows users to use human-readable aliases instead of complex spark addresses.
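As an illustration of the flow just described, here is a minimal sketch. It assumes the boolean `GetSparkAddress(name, outAddress)` signature seen elsewhere in this review; the helper name and whether the '@' prefix is stripped before the lookup are assumptions, not confirmed by the PR:
```cpp
#include <stdexcept>
#include <string>
#include "sparkname.h"

// Hypothetical helper, not the PR's exact code: resolve a recipient string
// that may be either a raw spark address or an '@'-prefixed spark name.
std::string ResolveSparkRecipient(const std::string &nameOrAddress) {
    if (!nameOrAddress.empty() && nameOrAddress[0] == '@') {
        std::string sparkAddress;
        CSparkNameManager *mgr = CSparkNameManager::GetInstance();
        // Assumed to return false when the name is not registered.
        if (!mgr->GetSparkAddress(nameOrAddress.substr(1), sparkAddress))
            throw std::runtime_error("Spark name not found: " + nameOrAddress);
        return sparkAddress;
    }
    return nameOrAddress; // already a raw spark address
}
```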
Lines 3911-3958: New RPC method `getsparknames` implemented correctly.
This function retrieves and lists all registered spark names, with an optional parameter to filter and show only names that belong to the wallet. The implementation properly validates that Spark is activated before proceeding and correctly formats the response with name and address pairs.
Lines 3960-4004: New RPC method `getsparknamedata` implemented properly.
This method retrieves detailed information about a specific spark name including its associated address, validity period (block height), and any additional metadata. The implementation checks for Spark activation and handles the data retrieval appropriately.
Lines 4006-4084: New RPC method `registersparkname` implemented with proper validation.
This function handles the registration of new spark names. It includes appropriate validation:
- Ensures wallet is unlocked and Spark is activated
- Validates name length (≤ 20 characters)
- Verifies the spark address format
- Calculates fee based on name length according to consensus parameters
- Creates and commits a transaction for registration
The implementation follows good error handling practices by providing descriptive error messages for various failure scenarios.
Lines 5956-5958: Correctly registered the new RPC methods in the command table.
The three new spark name functions have been properly added to the RPC command table, which ensures they'll be available through the RPC interface.
src/qt/addresstablemodel.cpp (12)
Line 27: Adding the SparkName constant looks good.
This constant will be used to identify Spark name entries in the address table, which is consistent with the other address type constants.
Lines 50-54: Good practice: initializing members and updating the constructor.
Adding the default initialization of `isMine` to `false` and updating the constructor to include the `_isMine` parameter is a good practice for maintaining object state consistency.
Lines 97-114: Well-structured implementation for tracking pending changes.
The `PendingSparkNameChange` struct and related methods for tracking changes are well-implemented:
- The struct properly encapsulates the change type and spark name data
- Thread safety is ensured with the critical section lock
- The methods follow a consistent pattern of appending changes to a pending list
This approach aligns with how other pending changes are handled in the codebase.
Lines 117-127: Proper signal connection handling in constructor/destructor.
The constructor connects to the spark name signals and the destructor disconnects them. This is important to prevent memory leaks and ensure proper cleanup when the object is destroyed.
Lines 129-145: Well-implemented method for refreshing spark names.
The `refreshSparkNames()` method:
- Retrieves spark names from the `CSparkNameManager`
- Properly handles the spark address and name formatting
- Checks if the address is owned by the wallet using `IsSparkAddressMine`
- Adds entries to the cache with the correct type and ownership information
This follows the same pattern as other address loading methods in the codebase.
Lines 151-152: Proper lock for thread safety.
Adding the `LOCK(cs_main)` is important for thread safety when accessing the `CSparkNameManager`, which likely depends on blockchain state.
Line 200: Good integration of the spark names refresh into the address table refresh.
Calling `refreshSparkNames()` within `refreshAddressTable()` ensures that spark names are loaded alongside other address types when the table is refreshed.
Lines 248-249: Consistent spark name identification.
This code consistently uses the "@" prefix to identify spark names, matching the approach used in the new address entry method.
Lines 317-342: Well-structured method for processing pending spark name changes.
The `processPendingSparkNameChanges` method:
- Safely extracts pending changes using proper locking
- Checks if addresses already exist in the model to determine the update type
- Updates entries with appropriate parameters including ownership status
- Follows the existing patterns for updating the address table
This implementation ensures that spark name changes are properly processed and reflected in the UI.
Lines 401-404: Good UI display representation for spark names.
The code properly differentiates between "own spark name" and "spark name" based on whether the address belongs to the wallet. This helps users identify their own spark names versus others in the UI.
Lines 783-786: Public method provides good API access for processing changes.
The `ProcessPendingSparkNameChanges` public method exposes the functionality to process pending spark name changes, which can be called from other parts of the application when necessary.
Lines 235-236: ❓ Verification inconclusive
Using a label prefix to determine address type might be fragile.
The code determines if an address is a spark name by checking if the label starts with "@". This approach relies on a naming convention that could potentially be bypassed if labels are manually edited.
Consider whether there's a more robust way to identify spark names that doesn't depend on the label format.
🏁 Script executed:
```bash
#!/bin/bash
# Check if there are any other methods to identify spark names in the codebase
rg -A 2 -B 2 "sparkName" --glob "*.{cpp,h}"
```
Length of output: 50403
Spark Name Identification – Verify Robustness
The code in question uses the pattern of checking whether the label starts with "@" to determine if an address is a spark name. Our investigation shows that this approach is used consistently throughout the codebase (for example, in sendcoinsdialog.cpp, createsparknamepage.cpp, and walletmodel.cpp). This suggests that the "@" prefix is an intentional convention for distinguishing spark names.
However, as noted, relying solely on the label format can be fragile, especially if users are allowed to manually edit labels. It might be worthwhile to reinforce this logic by either:
- Storing an explicit flag (or property) in the address entry to denote spark names, or
- Cross-checking against the spark name registry (via `CSparkNameManager`), so that even if the label is modified, the address type can still be determined reliably; a sketch follows this list.
Please verify whether the current trade-off is acceptable for your project or if implementing one of these alternatives would improve robustness.
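A sketch of the registry cross-check alternative mentioned above, assuming the same boolean `GetSparkAddress` lookup; the helper name and the equality check against the stored address are illustrative only:
```cpp
#include <string>
#include "sparkname.h"

// Hypothetical helper: classify an address book entry by consulting the
// registry instead of trusting the "@" label convention alone.
bool IsRegisteredSparkName(const std::string &label, const std::string &address) {
    if (label.empty() || label[0] != '@')
        return false; // keep the prefix as a fast hint only
    std::string registeredAddress;
    CSparkNameManager *mgr = CSparkNameManager::GetInstance();
    // Accept the entry only if the registry maps this name to this address.
    return mgr->GetSparkAddress(label.substr(1), registeredAddress)
        && registeredAddress == address;
}
```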
Actionable comments posted: 0
🧹 Nitpick comments (3)
src/chainparams.cpp (3)
Lines 173-179: Add explanatory comments to clarify the fee structure.
The new `standardSparkNamesFee` array defines the fees for spark names, but lacks documentation about what each value represents. Based on the PR objectives, I understand shorter names cost more, but the meaning of the -1 value and the exact mapping between array indices and name lengths isn't clear. Consider adding a comment like:
```cpp
// Array defines the fee in FIRO for spark names based on character length
// Index 0: Special value -1 (not usable)
// Index 1: Fee for 1-character names (1000 FIRO)
// Index 2: Fee for 2-character names (100 FIRO)
// Indices 3-5: Fee for 3-5 character names (10 FIRO)
// Indices 6-20: Fee for 6-20 character names (1 FIRO)
```
Lines 493-494: Consider using a constant or macro for future dates.
The comment indicates that this block height corresponds to approximately May 13th, 2025. For future maintenance, it might be helpful to define this as a named constant with the date in the comment to make it clear that this is a planned future activation.
Lines 173-179: Verify the fixed fee array approach against alternative implementations.
The fixed-size array approach for fees works but limits flexibility if fee structures need to change in the future. For a feature that may evolve, consider whether a more dynamic approach would be beneficial (see the sketch after this list):
- A function that calculates fees based on length
- A map/dictionary structure that could be updated
- Configurable fee parameters via RPC commands
However, the current implementation is simple and effective if the fee structure is expected to remain stable.
Also applies to: 493-494, 799-800, 1047-1048, 1291-1292
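As one possible shape for the first alternative, a length-based fee helper; the tiers follow the schedule documented earlier in this review (1000/100/10/1 FIRO), and the function name and -1 sentinel handling are assumptions:
```cpp
#include <cstddef>

// Hypothetical replacement for the fixed array: derive the per-year fee
// (in FIRO) from the name length. Returns -1 for lengths with no defined fee.
int SparkNameFeeForLength(std::size_t nameLength) {
    if (nameLength == 0 || nameLength > 20) return -1; // outside the valid range
    if (nameLength == 1) return 1000;
    if (nameLength == 2) return 100;
    if (nameLength <= 5) return 10;
    return 1; // 6-20 characters
}
```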
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (1)
- src/chainparams.cpp (5 hunks)
🧰 Additional context used
🧬 Code Definitions (1)
src/chainparams.cpp (1)
src/chainparams.h (1): consensus (60-60)
⏰ Context from checks skipped due to timeout of 90000ms (1)
- GitHub Check: Create Source Distribution
🔇 Additional comments (1)
src/chainparams.cpp (1)
Lines 799-800: Ensure a consistent activation strategy across networks.
I notice the Spark Names feature is activated at different relative points in each network:
- Mainnet: Block 1096000 (future - May 2025)
- Testnet: Block 174000
- Devnet: Block 3500
- Regtest: Block 2000
This appears to be an intentional staged rollout strategy, with the feature being available first on test networks before the mainnet. This is a good approach for testing new functionality.
Also applies to: 1047-1048, 1291-1292
Actionable comments posted: 0
♻️ Duplicate comments (1)
src/wallet/wallet.cpp (1)
Lines 5880-5894: 🛠️ Refactor suggestion: Implementation looks good but is missing a null check for sparkWallet.
This new method properly handles wallet lock status and ensures the mint wallet is available before delegating to the spark wallet implementation. However, it doesn't check if `sparkWallet` is null before dereferencing it on line 5893, which is inconsistent with the fix applied in the second hunk. Apply this fix to prevent a potential null pointer dereference:
```diff
 CWalletTx CWallet::CreateSparkNameTransaction(
         CSparkNameTxData &sparkNameData,
         CAmount sparkNameFee,
         CAmount &txFee,
         const CCoinControl *coinControl)
 {
     // sanity check
     EnsureMintWalletAvailable();

     if (IsLocked()) {
         throw std::runtime_error(_("Wallet locked"));
     }

+    if (!sparkWallet) {
+        throw std::runtime_error(_("Spark wallet not available"));
+    }
+
     return sparkWallet->CreateSparkNameTransaction(sparkNameData, sparkNameFee, txFee, coinControl);
 }
```
🧹 Nitpick comments (4)
src/spark/state.cpp (1)
Lines 245-247: Variable naming consideration for `fBackupRewrittenSparkNames`.
The introduction of this boolean flag is reasonable, but the name might be slightly ambiguous. If this flag is solely used to indicate spark name additions in the current block, a more direct name like `fHasNewSparkNames` or `fSparkNamesChanged` might be clearer.
src/qt/walletmodel.cpp (3)
Lines 1562-1577: The hardcoded sparkNameValidityBlocks value needs an explanation.
While the code correctly validates the spark name data, there's a hardcoded value of 1000 for `sparkNameValidityBlocks` with a comment saying "doesn't matter". This could be confusing for future developers. Consider adding a more detailed comment explaining why this value is being set but doesn't affect validation, or make it a named constant if it's a temporary placeholder.
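One way to apply the named-constant suggestion, as a sketch; the constant name is invented here:
```cpp
// Hypothetical named constant documenting that validation ignores this field.
static constexpr int kSparkNameValidityBlocksIgnoredByValidation = 1000;
sparkNameData.sparkNameValidityBlocks = kSparkNameValidityBlocksIgnoredByValidation;
```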
Lines 1590-1600: Consider adding logging for cases when a spark name is not found.
This function correctly retrieves the Spark address associated with a given Spark name. However, it silently returns an empty string when the name is not found. Consider adding debug logging for such cases to make troubleshooting easier.
```diff
 QString WalletModel::getSparkNameAddress(const QString &sparkName) {
     LOCK(cs_main);
     CSparkNameManager *sparkNameManager = CSparkNameManager::GetInstance();
     std::string sparkAddress;
     if (sparkNameManager->GetSparkAddress(sparkName.toStdString(), sparkAddress)) {
         return QString::fromStdString(sparkAddress);
     } else {
+        LogPrint(BCLog::SPARKNAME, "Spark name not found: %s\n", sparkName.toStdString());
         return "";
     }
 }
```
Lines 1602-1656: Improve the error message for Spark Name transactions.
The function properly prepares a transaction for registering a spark name with appropriate error handling. However, the error message title "Spend Spark" may be confusing for errors related to spark name registration. Consider using a more specific title like "Register Spark Name".
```diff
 catch (std::runtime_error const& e) {
     Q_EMIT message(
-        tr("Spend Spark"),
+        tr("Register Spark Name"),
         QString::fromStdString(e.what()),
         CClientUIInterface::MSG_ERROR);
     return TransactionCreationFailed;
 }
 catch (std::invalid_argument const& e) {
     Q_EMIT message(
-        tr("Spend Spark"),
+        tr("Register Spark Name"),
         QString::fromStdString(e.what()),
         CClientUIInterface::MSG_ERROR);
     return TransactionCreationFailed;
 }
```
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (15)
- src/Makefile.am (3 hunks)
- src/Makefile.qt.include (4 hunks)
- src/Makefile.test.include (2 hunks)
- src/qt/sendcoinsdialog.cpp (1 hunks)
- src/qt/sendcoinsentry.cpp (1 hunks)
- src/qt/walletmodel.cpp (2 hunks)
- src/rpc/client.cpp (1 hunks)
- src/spark/sparkwallet.cpp (7 hunks)
- src/spark/sparkwallet.h (3 hunks)
- src/spark/state.cpp (7 hunks)
- src/spark/state.h (2 hunks)
- src/txdb.cpp (1 hunks)
- src/wallet/rpcwallet.cpp (3 hunks)
- src/wallet/wallet.cpp (2 hunks)
- src/wallet/wallet.h (1 hunks)
🚧 Files skipped from review as they are similar to previous changes (11)
- src/Makefile.am
- src/rpc/client.cpp
- src/txdb.cpp
- src/qt/sendcoinsdialog.cpp
- src/qt/sendcoinsentry.cpp
- src/wallet/wallet.h
- src/Makefile.qt.include
- src/Makefile.test.include
- src/spark/state.h
- src/spark/sparkwallet.h
- src/spark/sparkwallet.cpp
🧰 Additional context used
🧬 Code Definitions (4)
src/qt/walletmodel.cpp (1)
src/qt/walletmodel.h (21): wallet (141-141), wallet (368-368), address (155-155), address (156-156), address (157-157), address (273-273), address (274-274), address (275-275), address (386-386), name (203-203), sparkNameFee (205-205), sparkName (207-207), transaction (176-176), transaction (179-179), transaction (196-198), transaction (200-201), transaction (209-213), transaction (226-226), transaction (229-229), coinControl (143-143), SendCoinsReturn (166-170)
src/spark/state.cpp (2)
src/sparkname.h (5): nHeight (165-165), IsInConflict (124-128), IsInConflict (124-124), IsInConflict (131-144), IsInConflict (131-131)
src/sparkname.cpp (2): ToUpper (323-328), ToUpper (323-323)
src/wallet/wallet.cpp (2)
src/spark/sparkwallet.cpp (5): CreateSparkNameTransaction (1615-1652), CreateSparkNameTransaction (1615-1615), address (191-191), address (261-261), address (1637-1637)
src/wallet/wallet.h (15): sparkNameData (1108-1112), coinControl (972-972), coinControl (974-974), coinControl (977-977), address (1040-1040), address (1075-1075), address (1213-1213), address (1214-1214), address (1215-1215), address (1216-1216), address (1217-1217), address (1401-1401), address (1404-1404), address (1425-1425), address (1426-1426)
src/wallet/rpcwallet.cpp (5)
src/wallet/wallet.h (1): sparkNameData (1108-1112)
src/rpc/server.h (5): HelpExampleCli (201-201), HelpExampleRpc (202-202), name (156-156), name (157-157), name (179-179)
src/rpc/server.cpp (6): HelpExampleCli (591-594), HelpExampleCli (591-591), HelpExampleRpc (596-600), HelpExampleRpc (596-596), name (371-377), name (371-371)
src/spark/state.h (3): IsSparkAllowed (45-45), IsSparkAllowed (46-46), GetNetworkType (47-47)
src/sparkname.h (10): name (110-110), name (119-119), name (156-156), name (157-157), name (161-161), name (163-163), sparkName (170-170), GetNetworkType (12-12), sparkNameData (147-147), sparkNameData (150-150)
⏰ Context from checks skipped due to timeout of 90000ms (2)
- GitHub Check: Create Source Distribution
- GitHub Check: Create Source Distribution
🔇 Additional comments (14)
src/wallet/wallet.cpp (1)
Line 8265: Important fix for a potential null pointer dereference.
This change adds a null check for `sparkWallet` before accessing its method, preventing a potential crash if the sparkWallet is not initialized. This is an important safety improvement.
src/spark/state.cpp (6)
Line 2: Header inclusion looks good.
No issues with adding `#include "sparkname.h"` here, as it is presumably needed to reference spark name-related functionality.
Line 36: Ensure consistent error handling for `AddBlock`.
While calling `CSparkNameManager::GetInstance()->AddBlock(blockIndex)` helps maintain spark name state when reconstructing the chain from the genesis block, confirm that the function either cannot fail or that its failures are inconsequential. If it can fail, consider logging or handling return/error states.
Lines 316-328: Check for potential duplicate spark names in the same block.
Storing new spark names in `pindexNew->addedSparkNames` ensures the block will retain them. However, there is no check here to see if the same name is added twice. Confirm that higher-level checks (e.g., in `CheckSparkNameTx` or conflict checks) guarantee no duplicates. Otherwise, consider verifying or rejecting duplicates at this insertion point.
Lines 342-346: Validate the proper ordering of name cleanup and addition.
By calling `RemoveSparkNamesLosingValidity` immediately before `AddBlock`, you ensure expired names are removed prior to processing new ones. This ordering appears logical, but verify that partial failures in either step are handled or logged.
Lines 390-392: Synchronization with the spark name manager on block disconnection.
Removing spark names from the tip block during `DisconnectTipSpark` helps keep the blockchain and spark name manager state synchronized. This change is consistent with the approach taken in `AddBlock`.
Lines 805-821: Validate uppercase usage and conflict resolution for spark names.
Here, you confirm no conflicts exist before adding a spark name to `sparkTxInfo->sparkNames` using uppercase keys. Ensure any name lookups in other parts of the code also use consistent uppercasing, preventing mismatch or silent duplication between differently cased names.
src/wallet/rpcwallet.cpp (5)
Lines 3856-3865: Well-implemented Spark name resolution in the spendspark function.
The changes add support for using spark names by checking if the input starts with '@' and resolving it to a spark address. This improves usability by allowing users to send spark transactions using memorable names instead of lengthy addresses.
Lines 3953-4000: Good implementation of the getsparknames RPC method.
This function provides a well-structured way to list all spark names, with an optional filter to show only those belonging to the wallet. The implementation properly validates that Spark is activated before proceeding, and the error handling is comprehensive with clear, consistent results formatting.
Lines 4002-4046: Clear implementation of the getsparknamedata RPC method.
This function correctly retrieves and formats detailed information about a specific spark name, including the associated address, validity period (block height), and additional data. The error handling is appropriate, and the result format is intuitive.
Lines 4048-4126: Well-structured registersparkname RPC implementation.
The implementation handles all necessary steps for spark name registration:
- Validates wallet availability and unlocked status
- Ensures spark feature is activated
- Validates name format and registration duration
- Calculates appropriate fee based on name length
- Creates and commits the transaction
The error handling is comprehensive and provides clear messages. The fee calculation aligns with the PR objective of charging more for shorter names.
Lines 5998-6000: Proper registration of the new RPC methods.
The new spark name functions are correctly registered in the RPC command table, making them available through the API.
src/qt/walletmodel.cpp (2)
Lines 1326-1339: Implementation for generating a new Spark address looks good.
This function properly generates a new Spark address, adds it to the address book with the appropriate purpose ("receive"), and returns the encoded address as a QString. The wallet lock is correctly held during the operation.
Lines 1579-1588: Transaction initialization for spark name registration looks good.
This function correctly initializes a transaction for registering a spark name, with the recipient being the development fund address defined in the consensus parameters. The function properly sets the transaction amount to the provided spark name fee.
The only comment: if there are no objections, please move sparkname.h/.cpp into the spark directory.
Actionable comments posted: 0
♻️ Duplicate comments (4)
src/sparkname.cpp (4)
Line 12: Consider using a thread-safe singleton initialization pattern.
This global pointer instantiation is not guaranteed to be thread-safe. A recommended approach would be to use a Meyers singleton or another thread-safe pattern; a sketch follows the tool note below.
🧰 Tools
🪛 Cppcheck (2.10-2)
[performance] 12-12: Function parameter 'label' should be passed by const reference.
(passedByValue)
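For reference, the Meyers-singleton shape being suggested (essentially the same change appears as a diff in a later duplicate of this comment):
```cpp
// Function-local static: C++11 guarantees thread-safe one-time initialization.
CSparkNameManager *CSparkNameManager::GetInstance() {
    static CSparkNameManager instance;
    return &instance;
}
```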
Lines 162-163: Validate the blocks-per-year calculation.
This identical calculation was flagged as potentially inaccurate in a prior review. Verify that multiplying 24×24 truly reflects the correct blocks-per-hour or blocks-per-day for this chain.
Lines 287-325: Add a maximum iteration limit to avoid a potential infinite loop.
As previously noted, this loop lacks a failsafe. If a valid hash cannot be found, the code will continue indefinitely. Consider imposing a maximum number of attempts (e.g., 1000) to detect errors gracefully.
🧰 Tools
🪛 Cppcheck (2.10-2)
[performance] 311-311: Variable 'type' is assigned in constructor body. Consider performing initialization in initialization list.
(useInitializationList)
[performance] 312-312: Variable 'hashBytes' is assigned in constructor body. Consider performing initialization in initialization list.
(useInitializationList)
Lines 382-390: Reconcile allowed characters with the documented standards.
This block permits periods ('.'), whereas the earlier documentation or comments suggested only alphanumeric characters and hyphens. Update the comment or remove the '.' to align the code and documentation.
🧹 Nitpick comments (1)
src/sparkname.cpp (1)
Lines 14-49: Double-check concurrency handling in AddBlock/RemoveBlock.
These methods modify shared data structures (`sparkNames` and `sparkNameAddresses`) without any evident locking or thread-safety mechanism. Ensure this code is only called in single-threaded contexts or protected by a proper synchronization strategy to avoid data races.
🧰 Tools
🪛 Cppcheck (2.10-2)
[performance] 27-27: Variable 'txid' is assigned in constructor body. Consider performing initialization in initialization list.
(useInitializationList)
[performance] 23-23: Variable 'prevhash' is assigned in constructor body. Consider performing initialization in initialization list.
(useInitializationList)
[performance] 44-44: Variable 'type' is assigned in constructor body. Consider performing initialization in initialization list.
(useInitializationList)
[performance] 45-45: Variable 'addressBytes' is assigned in constructor body. Consider performing initialization in initialization list.
(useInitializationList)
[performance] 46-46: Variable 'txhash' is assigned in constructor body. Consider performing initialization in initialization list.
(useInitializationList)
[performance] 37-37: Function parameter 'additional_data' should be passed by const reference.
(passedByValue)
[performance] 38-38: Function parameter 'associated_data' should be passed by const reference.
(passedByValue)
[performance] 22-22: Prefer prefix ++/-- operators for non-primitive types.
(postfixOperator)
[performance] 34-34: Prefer prefix ++/-- operators for non-primitive types.
(postfixOperator)
[performance] 37-37: Prefer prefix ++/-- operators for non-primitive types.
(postfixOperator)
[performance] 45-45: Prefer prefix ++/-- operators for non-primitive types.
(postfixOperator)
[performance] 48-48: Prefer prefix ++/-- operators for non-primitive types.
(postfixOperator)
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (1)
- src/sparkname.cpp (1 hunks)
🧰 Additional context used
🧬 Code Graph Analysis (1)
src/sparkname.cpp (1)
src/sparkname.h (14): pindex (167-167), pindex (168-168), name (110-110), name (119-119), name (156-156), name (157-157), name (161-161), name (163-163), tx (105-105), tx (107-107), sparkNameData (147-147), sparkNameData (150-150), nHeight (165-165), txSparkSpend (153-153)
🪛 Cppcheck (2.10-2)
src/sparkname.cpp
[performance] 377-377: Variable 'vHave' is assigned in constructor body. Consider performing initialization in initialization list.
(useInitializationList)
[performance] 51-51: Variable 'hash' is assigned in constructor body. Consider performing initialization in initialization list.
(useInitializationList)
[performance] 27-27: Variable 'txid' is assigned in constructor body. Consider performing initialization in initialization list.
(useInitializationList)
[performance] 65-65: Variable 'txid' is assigned in constructor body. Consider performing initialization in initialization list.
(useInitializationList)
[performance] 69-69: Variable 'addressType' is assigned in constructor body. Consider performing initialization in initialization list.
(useInitializationList)
[performance] 70-70: Variable 'addressHash' is assigned in constructor body. Consider performing initialization in initialization list.
(useInitializationList)
[performance] 147-147: Variable 'blockHash' is assigned in constructor body. Consider performing initialization in initialization list.
(useInitializationList)
[performance] 182-182: Variable 'type' is assigned in constructor body. Consider performing initialization in initialization list.
(useInitializationList)
[performance] 183-183: Variable 'hashBytes' is assigned in constructor body. Consider performing initialization in initialization list.
(useInitializationList)
[performance] 184-184: Variable 'txhash' is assigned in constructor body. Consider performing initialization in initialization list.
(useInitializationList)
[performance] 216-216: Variable 'script' is assigned in constructor body. Consider performing initialization in initialization list.
(useInitializationList)
[performance] 270-270: Variable 'type' is assigned in constructor body. Consider performing initialization in initialization list.
(useInitializationList)
[performance] 271-271: Variable 'hashBytes' is assigned in constructor body. Consider performing initialization in initialization list.
(useInitializationList)
[performance] 274-274: Variable 'txhash' is assigned in constructor body. Consider performing initialization in initialization list.
(useInitializationList)
[performance] 311-311: Variable 'type' is assigned in constructor body. Consider performing initialization in initialization list.
(useInitializationList)
[performance] 312-312: Variable 'hashBytes' is assigned in constructor body. Consider performing initialization in initialization list.
(useInitializationList)
[performance] 344-344: Variable 'type' is assigned in constructor body. Consider performing initialization in initialization list.
(useInitializationList)
[performance] 345-345: Variable 'hashBytes' is assigned in constructor body. Consider performing initialization in initialization list.
(useInitializationList)
[performance] 23-23: Variable 'prevhash' is assigned in constructor body. Consider performing initialization in initialization list.
(useInitializationList)
[performance] 44-44: Variable 'type' is assigned in constructor body. Consider performing initialization in initialization list.
(useInitializationList)
[performance] 45-45: Variable 'addressBytes' is assigned in constructor body. Consider performing initialization in initialization list.
(useInitializationList)
[performance] 46-46: Variable 'txhash' is assigned in constructor body. Consider performing initialization in initialization list.
(useInitializationList)
[performance] 52-52: Variable 'type' is assigned in constructor body. Consider performing initialization in initialization list.
(useInitializationList)
[performance] 53-53: Variable 'addressBytes' is assigned in constructor body. Consider performing initialization in initialization list.
(useInitializationList)
[performance] 10-10: Function parameter 'label' should be passed by const reference.
(passedByValue)
[performance] 12-12: Function parameter 'label' should be passed by const reference.
(passedByValue)
[performance] 81-81: Function parameter 'label' should be passed by const reference.
(passedByValue)
[performance] 105-105: Function parameter 'data' should be passed by const reference.
(passedByValue)
[performance] 37-37: Function parameter 'additional_data' should be passed by const reference.
(passedByValue)
[performance] 38-38: Function parameter 'associated_data' should be passed by const reference.
(passedByValue)
[performance] 74-74: Function parameter 'hash_bind_inner' should be passed by const reference.
(passedByValue)
[performance] 75-75: Function parameter '_name' should be passed by const reference.
(passedByValue)
[performance] 75-75: Function parameter '_sparkAddress' should be passed by const reference.
(passedByValue)
[performance] 75-75: Function parameter '_additionalInfo' should be passed by const reference.
(passedByValue)
[error] 106-106: Signed integer overflow for expression '1<<31'.
(integerOverflow)
[performance] 22-22: Prefer prefix ++/-- operators for non-primitive types.
(postfixOperator)
[performance] 34-34: Prefer prefix ++/-- operators for non-primitive types.
(postfixOperator)
[performance] 37-37: Prefer prefix ++/-- operators for non-primitive types.
(postfixOperator)
[performance] 45-45: Prefer prefix ++/-- operators for non-primitive types.
(postfixOperator)
[performance] 48-48: Prefer prefix ++/-- operators for non-primitive types.
(postfixOperator)
[performance] 83-83: Prefer prefix ++/-- operators for non-primitive types.
(postfixOperator)
⏰ Context from checks skipped due to timeout of 90000ms (2)
- GitHub Check: Create Source Distribution
- GitHub Check: Create Source Distribution
🔇 Additional comments (1)
src/sparkname.cpp (1)
Line 168: Verify array indexing for name-based fees.
`consensusParams.nSparkNamesFee[sparkNameData.name.size()]` could lead to an out-of-bounds index if the name length exceeds the fee array's length. Confirm that the array is sized to handle the maximum valid spark name length.
Actionable comments posted: 0
🧹 Nitpick comments (2)
src/test/sparkname_tests.cpp (2)
Lines 37-46: Avoid using BOOST_CHECK within utility methods.
Utility functions like `IsSparkNamePresent` and `GetSparkNameAdditionalData` embed test assertions. This can reduce clarity by mixing test logic with utility functionality. Consider returning booleans or relevant data, then performing `BOOST_CHECK` calls within the main test body for better separation of concerns.
Lines 109-163: Large method for modifying spark name transactions.
`ModifySparkNameTx` handles multiple duties: reading transaction data, modifying it, recalculating proofs, and re-injecting the payload. Consider splitting these responsibilities into smaller helper functions for enhanced clarity, maintainability, and reusability. Also verify that the loop incrementing `hashFailsafe` can never become an infinite loop under normal conditions.
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (1)
- src/test/sparkname_tests.cpp (1 hunks)
🧰 Additional context used
🧬 Code Graph Analysis (1)
src/test/sparkname_tests.cpp (6)
src/qt/addresstablemodel.cpp (1): sparkNameManager (129-145)
src/libspark/keys.h (1): network (84-84)
src/qt/walletmodel.h (1): sparkNameFee (205-205)
src/chainparams.h (1): consensus (60-60)
src/validation.h (1): chainparams (595-595)
src/validation.cpp (1): mempool (127-127)
🪛 Cppcheck (2.10-2)
src/test/sparkname_tests.cpp
[error] 174-174: There is an unknown macro here somewhere. Configuration is required. If BOOST_FIXTURE_TEST_SUITE is a macro then please configure it.
(unknownMacro)
⏰ Context from checks skipped due to timeout of 90000ms (2)
- GitHub Check: Create Source Distribution
- GitHub Check: Create Source Distribution
🔇 Additional comments (5)
src/test/sparkname_tests.cpp (5)
Lines 19-20: Potential side effect from global consensus references.
The test class stores a mutable reference to global consensus parameters. Tests that alter global state in this manner can lead to hard-to-diagnose side effects or concurrency issues if tests run in parallel. Confirm that tests are either run serially or that these modifications are properly isolated to avoid unintended interference.
Lines 47-53: Initialization strategy looks solid.
The `Initialize` method robustly sets up the testing environment by generating blocks and mints before enabling full transaction broadcasts. This approach is clear, straightforward, and good for consistent test prep.
Lines 75-78: Validate the annual fee calculation.
The calculation of year blocks and the derived fee might have off-by-one issues in edge cases. Confirm the correctness of `(sparkNameData.sparkNameValidityBlocks + 24 * 24 * 365 - 1) / (24 * 24 * 365)` so that it accurately reflects the intended annual periods without rounding anomalies.
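For reference, this expression is the standard ceiling-division idiom: with Y = 24 * 24 * 365 = 210,240 blocks per year, a validity of exactly 210,240 blocks gives (210,240 + 210,239) / 210,240, which truncates to 1 year; 210,241 blocks gives 2; and the only edge case is that a validity of 0 yields 0 rather than 1.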
Lines 100-107: Block disconnection approach is consistent.
Providing a dedicated `DisconnectAndInvalidate` method ensures thorough coverage for rollback scenarios. This method neatly wraps typical block invalidation steps, aiding in maintainability.
Lines 174-310: Extensive test coverage for spark names.
The test cases replicate numerous real-world scenarios: mempool conflicts, rollback, fee validation, ownership proofs, and hard-fork transitions. This suite provides broad coverage, helping ensure reliability of the spark name feature.
🧰 Tools
🪛 Cppcheck (2.10-2)
[error] 174-174: There is an unknown macro here somewhere. Configuration is required. If BOOST_FIXTURE_TEST_SUITE is a macro then please configure it.
(unknownMacro)
Actionable comments posted: 2
🧹 Nitpick comments (1)
src/wallet/rpcwallet.cpp (1)
Lines 4097-4098: Consider documenting the block calculation for the validity period.
The formula `numberOfYears * 365*24*24` assumes 24 blocks per hour (one block every 2.5 minutes). It would be beneficial to add a comment explaining this calculation or use constants to make the formula clearer.
```diff
-sparkNameData.sparkNameValidityBlocks = numberOfYears * 365*24*24;
+// Calculate validity period in blocks: years * (365 days) * (24 hours) * (24 blocks per hour)
+const int BLOCKS_PER_HOUR = 24; // One block every 2.5 minutes
+const int BLOCKS_PER_DAY = 24 * BLOCKS_PER_HOUR;
+const int BLOCKS_PER_YEAR = 365 * BLOCKS_PER_DAY;
+sparkNameData.sparkNameValidityBlocks = numberOfYears * BLOCKS_PER_YEAR;
```
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (4)
- src/qt/createsparknamepage.cpp (1 hunks)
- src/qt/walletmodel.cpp (2 hunks)
- src/qt/walletmodel.h (2 hunks)
- src/wallet/rpcwallet.cpp (3 hunks)
🚧 Files skipped from review as they are similar to previous changes (3)
- src/qt/walletmodel.h
- src/qt/createsparknamepage.cpp
- src/qt/walletmodel.cpp
🧰 Additional context used
🧬 Code Graph Analysis (1)
src/wallet/rpcwallet.cpp (3)
src/wallet/wallet.h (1): sparkNameData (1108-1112)
src/spark/state.h (3): IsSparkAllowed (45-45), IsSparkAllowed (46-46), GetNetworkType (47-47)
src/sparkname.h (10): name (110-110), name (119-119), name (156-156), name (157-157), name (161-161), name (163-163), sparkName (170-170), GetNetworkType (12-12), sparkNameData (147-147), sparkNameData (150-150)
⏰ Context from checks skipped due to timeout of 90000ms (6)
- GitHub Check: Build for macOS
- GitHub Check: Build for Linux
- GitHub Check: Build for Windows
- GitHub Check: Build for macOS
- GitHub Check: Build for Windows
- GitHub Check: Build for Linux
🔇 Additional comments (4)
src/wallet/rpcwallet.cpp (4)
Lines 3856-3865: Good implementation for resolving spark names in spendspark.
The modification to the `spendspark` function allows users to reference spark addresses by their spark names (prefixed with '@'). This enhancement provides a cleaner user experience while maintaining backward compatibility.
Lines 3953-4000: Well-structured implementation of getsparknames.
The `getsparknames` function provides a clean way to list all spark names, with an option to filter only those owned by the wallet. The implementation includes proper error handling and parameter validation.
Lines 4048-4098: Good parameter validation in registersparkname.
The `registersparkname` function thoroughly validates all input parameters before proceeding with the registration. The checks for wallet availability, spark activation, name length, and number of years are all appropriate.
Lines 6004-6006: Proper registration of the new RPC commands.
The RPC table registration for the new functions is correctly implemented, with appropriate access flags.
```cpp
CSparkNameManager *sparkNameManager = CSparkNameManager::GetInstance();
std::string errorDescription;
if (!sparkNameManager->ValidateSparkNameData(sparkNameData, errorDescription))
    throw JSONRPCError(RPC_INVALID_PARAMETER, "Error creating spark name: " + errorDescription);

CAmount sparkNameFee = consensusParams.nSparkNamesFee[sparkName.size()]*COIN*numberOfYears;
CAmount fee;
```
💡 Verification agent
🧩 Analysis chain
Verify the sparkName size exists in the fee map.
The code accesses `consensusParams.nSparkNamesFee[sparkName.size()]` directly, which could lead to an out-of-bounds access if the size isn't in the map. Consider using a safer access method.
🏁 Script executed:
```bash
#!/bin/bash
# Check how nSparkNamesFee is defined and which sizes it covers
# First check for the declaration in consensus parameters
echo "Searching for nSparkNamesFee declaration..."
rg "nSparkNamesFee" --type cpp -A 10
```
Length of output: 6800
Action Required: Ensure Safe Access to nSparkNamesFee
The code in src/wallet/rpcwallet.cpp (lines 4099-4105) accesses `consensusParams.nSparkNamesFee[sparkName.size()]` directly. Since `nSparkNamesFee` is defined as a fixed-size `std::array<int,21>`, there's a risk of an out-of-bounds access if `sparkName.size()` is greater than or equal to 21.
Please add a bounds check before accessing the fee array. For example:
- Verify the sparkName size:
```cpp
if (sparkName.size() >= consensusParams.nSparkNamesFee.size())
    throw JSONRPCError(RPC_INVALID_PARAMETER, "Invalid spark name length; no fee defined for this size.");
```
- Then safely access the fee:
```cpp
CAmount sparkNameFee = consensusParams.nSparkNamesFee[sparkName.size()] * COIN * numberOfYears;
```
This change will prevent potential undefined behavior due to out-of-bounds array access.
Actionable comments posted: 2
♻️ Duplicate comments (4)
src/sparkname.h (1)
Lines 30-31: Document the allowance for '.' if truly intended.
The current documentation says "only alphanumeric characters and hyphens," but the code in `IsSparkNameValid` allows a '.' character. Update either the code or this comment to reflect the actual rules.
src/sparkname.cpp (3)
Line 12: Implement a thread-safe singleton or Meyers singleton pattern.
Relying on a globally allocated pointer (`sharedSparkNameManager`) can introduce threading issues or initialization order hazards. Consider switching to a safer static instance approach:
```diff
-CSparkNameManager *CSparkNameManager::sharedSparkNameManager = new CSparkNameManager();
+CSparkNameManager *CSparkNameManager::GetInstance() {
+    static CSparkNameManager instance;
+    return &instance;
+}
```
Lines 382-390: Synchronize the name rules with the documented constraints.
This code allows '.' while the comment in sparkname.h states only alphanumeric characters and hyphens. If '.' is truly valid, update the doc string or remove the dot from the check.
Lines 287-325: 🛠️ Refactor suggestion: Prevent indefinite attempts in `AppendSparkNameTxData`.
This loop increments `hashFailsafe` with no upper limit. In extremely pathological cases, it might never find a valid scalar. Introduce a maximum attempt threshold and throw an exception or bail out to avoid potential hung states.
```diff
+const uint32_t MAX_HASH_FAILSAFE_ATTEMPTS = 1000;
 for (uint32_t n=0; ; n++) {
+    if (n >= MAX_HASH_FAILSAFE_ATTEMPTS) {
+        throw std::runtime_error("Giving up after too many hashFailsafe attempts");
+    }
     ...
 }
```
🧹 Nitpick comments (6)
src/test/sparkname_tests.cpp (4)
Lines 16-36: Ensure the test fixture rollback steps are clearly documented.
The constructor and destructor neatly set up and tear down the environment, including resetting the `sparkState` and `sparkNameManager`. Consider adding in-code comments that clarify expected behavior when rolling back or reinitializing the test environment, so future maintainers easily grasp the intended lifecycle of objects here.
Lines 38-46: Consider consolidating or reusing test utility functions.
`IsSparkNamePresent` and `GetSparkNameAdditionalData` both call `sparkNameManager` and perform related lookups. If more Spark Name checks are added in the future, grouping these helper calls into a single utility or a local test helper (e.g., a struct with state) can keep test code consistent and reduce duplication.
Lines 48-67: Add explicit assertions for generated addresses.
`Initialize` and `GenerateSparkAddress` create essential test prerequisites. However, there's no direct assertion verifying that the generated Spark address is valid before usage. Consider adding a check (e.g., non-empty address, matches expected format) to detect issues early in the test pipeline.
Lines 69-99: Validate the returned transaction object to prevent silent test failures.
Both `CreateSparkNameTx` overloads produce a new transaction but defer final checks to the caller or test code. If any internal error arises (e.g., insufficient fees, address issues), an invalid transaction could be returned. Adding immediate test assertions or logs before returning the transaction (especially in the non-committed path) can help detect unexpected states.
src/sparkname.h (1)
Line 107: Elaborate on validation failure outcomes.
`CheckSparkNameTx(...)` can return `true` in certain early-exit scenarios (e.g., below the start block) or set an error via `CValidationState`. If an actual error is flagged, clarifying how it surfaces to callers can help avoid confusion about partial acceptance vs. silent rejections.
src/sparkname.cpp (1)
Lines 62-68: Enforce consistent name addition in `DumpSparkNameData`.
`DumpSparkNameData` simply collects all current spark names. If new conditions (e.g., "hidden" or partial names) arise in the future, consider filtering or logging them explicitly to avoid unexpected reveals or incomplete data. Currently, the open approach is fine.
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (3)
- src/sparkname.cpp (1 hunks)
- src/sparkname.h (1 hunks)
- src/test/sparkname_tests.cpp (1 hunks)
🧰 Additional context used
🧬 Code Graph Analysis (3)
src/test/sparkname_tests.cpp (4)
src/sparkname.h (6): name (110-110), name (119-119), name (156-156), name (157-157), name (161-161), name (163-163)
src/chainparams.h (1): consensus (60-60)
src/validation.h (1): chainparams (595-595)
src/sparkname.cpp (4): ss (204-204), ss (296-296), GetSparkNameAdditionalData (92-99), GetSparkNameAdditionalData (92-92)

src/sparkname.h (6)
src/test/sparkname_tests.cpp (10): name (38-41), name (38-38), name (43-46), name (43-43), name (91-99), name (91-91), sparkAddress (142-142), sparkNameData (69-89), sparkNameData (69-69), address (58-58)
src/qt/createsparknamepage.h (1): name (37-37)
src/sparkname.cpp (4): sparkAddress (225-225), sparkAddress (307-307), ToUpper (327-332), ToUpper (327-327)
src/spark/sparkwallet.h (4): address (48-48), address (65-65), address (66-66), address (67-67)
src/spark/sparkwallet.cpp (5): address (191-191), address (261-261), address (1637-1637), getAddress (252-257), getAddress (252-252)
src/wallet/rpcwallet.cpp (7): address (1233-1233), address (3572-3572), address (3696-3696), address (3740-3740), address (3909-3909), address (4528-4528), address (4652-4652)

src/sparkname.cpp (4)
src/sparkname.h (14): pindex (167-167), pindex (168-168), name (110-110), name (119-119), name (156-156), name (157-157), name (161-161), name (163-163), tx (105-105), tx (107-107), sparkNameData (147-147), sparkNameData (150-150), nHeight (165-165), txSparkSpend (153-153)
src/test/sparkname_tests.cpp (15): name (38-41), name (38-38), name (43-46), name (43-43), name (91-99), name (91-91), address (58-58), tx (110-164), tx (110-110), sparkTx (112-112), sparkNameData (69-89), sparkNameData (69-69), params (56-67), ss (131-131), incomingViewKey (146-146)
src/spark/sparkwallet.h (5): address (48-48), address (65-65), address (66-66), address (67-67), params (39-39)
src/libspark/keys.h (4): m (87-87), m (88-91), m (93-94), str (85-85)
🪛 Cppcheck (2.10-2)
src/test/sparkname_tests.cpp
[error] 175-175: There is an unknown macro here somewhere. Configuration is required. If BOOST_FIXTURE_TEST_SUITE is a macro then please configure it.
(unknownMacro)
⏰ Context from checks skipped due to timeout of 90000ms (2)
- GitHub Check: Create Source Distribution
- GitHub Check: Create Source Distribution
🔇 Additional comments (2)
src/test/sparkname_tests.cpp (1)
Lines 175-311: Excellent coverage of block reorg scenarios and mempool conflicts.
The `BOOST_AUTO_TEST_CASE(general)` and `BOOST_AUTO_TEST_CASE(hfblocknumber)` cases provide a thorough run-through of lifecycle events for spark names: creation, invalidation, reorg, and re-validation. These cases greatly enhance reliability.
🧰 Tools
🪛 Cppcheck (2.10-2)
[error] 175-175: There is an unknown macro here somewhere. Configuration is required. If BOOST_FIXTURE_TEST_SUITE is a macro then please configure it.
(unknownMacro)
src/sparkname.cpp (1)
Lines 162-166: Check the usage of 24*24 for blocks per hour.
`constexpr int nBlockPerYear = 365*24*24;` implies 576 blocks per day. Confirm that 2.5-minute block spacing is the intended design. If block times differ, or if an exact daily rate is crucial for fee calculations, revise to ensure the correct block count per year.
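For reference, the arithmetic behind the constant under the stated 2.5-minute target spacing: 60 / 2.5 = 24 blocks per hour, 24 * 24 = 576 blocks per day, and 365 * 576 = 210,240 blocks per year, which is exactly what 365*24*24 computes.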
```cpp
void DisconnectAndInvalidate() {
    LOCK(cs_main);
    CBlockIndex *pindex = chainActive.Tip();
    DisconnectBlocks(1);
    CValidationState state;
    const CChainParams &chainparams = ::Params();
    InvalidateBlock(state, chainparams, pindex);
}

void ModifySparkNameTx(CMutableTransaction &tx, std::function<void(CSparkNameTxData &)> modify, bool fRecalcOwnershipProof = true) {
    const spark::Params *params = spark::Params::get_default();
    spark::SpendTransaction sparkTx(params);

    CSparkNameTxData sparkNameData;
    size_t sparkNameDataPos;
    BOOST_CHECK(sparkNameManager->ParseSparkNameTxData(tx, sparkTx, sparkNameData, sparkNameDataPos));

    modify(sparkNameData);

    if (fRecalcOwnershipProof) {
        for (uint32_t n=0; ; n++) {
            sparkNameData.addressOwnershipProof.clear();
            sparkNameData.hashFailsafe = n;

            CMutableTransaction txCopy(tx);
            CDataStream serializedSparkNameData(SER_NETWORK, PROTOCOL_VERSION);
            serializedSparkNameData << sparkNameData;
            txCopy.vExtraPayload.erase(txCopy.vExtraPayload.begin() + sparkNameDataPos, txCopy.vExtraPayload.end());
            txCopy.vExtraPayload.insert(txCopy.vExtraPayload.end(), serializedSparkNameData.begin(), serializedSparkNameData.end());

            CHashWriter ss(SER_GETHASH, PROTOCOL_VERSION);
            ss << txCopy;

            spark::Scalar m;
            try {
                m.SetHex(ss.GetHash().ToString());
            }
            catch (const std::exception &) {
                continue;   // increase hashFailsafe and try again
            }

            spark::Address sparkAddress(spark::Params::get_default());
            spark::OwnershipProof ownershipProof;

            spark::SpendKey spendKey = pwalletMain->sparkWallet->generateSpendKey(spark::Params::get_default());
            spark::IncomingViewKey incomingViewKey(spendKey);
            sparkAddress.decode(sparkNameData.sparkAddress);
            sparkAddress.prove_own(m, spendKey, incomingViewKey, ownershipProof);

            CDataStream ownershipProofStream(SER_NETWORK, PROTOCOL_VERSION);
            ownershipProofStream << ownershipProof;

            sparkNameData.addressOwnershipProof.assign(ownershipProofStream.begin(), ownershipProofStream.end());

            break;
        }
    }

    CDataStream serializedSpark(SER_NETWORK, PROTOCOL_VERSION);
    serializedSpark << sparkNameData;

    tx.vExtraPayload.erase(tx.vExtraPayload.begin() + sparkNameDataPos, tx.vExtraPayload.end());
    tx.vExtraPayload.insert(tx.vExtraPayload.end(), serializedSpark.begin(), serializedSpark.end());
}
```
🛠️ Refactor suggestion
Consider bounding the ownership proof recalculations.
Within `ModifySparkNameTx`, the loop that increments `hashFailsafe` can theoretically continue in rare edge cases. Similar logic in other parts of the code (e.g., `AppendSparkNameTxData`) has been flagged for potential infinite loops. Placing a hard cap on attempts avoids a worst-case scenario that never returns.
```diff
+    const uint32_t MAX_HASH_FAILSAFE_ATTEMPTS = 1000;
     for (uint32_t n=0; ; n++) {
+        if (n >= MAX_HASH_FAILSAFE_ATTEMPTS) {
+            BOOST_FAIL("Failed to recalc ownership proof after max attempts");
+            return;
+        }
         // ...
     }
```
📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
void DisconnectAndInvalidate() { | |
LOCK(cs_main); | |
CBlockIndex *pindex = chainActive.Tip(); | |
DisconnectBlocks(1); | |
CValidationState state; | |
const CChainParams &chainparams = ::Params(); | |
InvalidateBlock(state, chainparams, pindex); | |
} | |
void ModifySparkNameTx(CMutableTransaction &tx, std::function<void(CSparkNameTxData &)> modify, bool fRecalcOwnershipProof = true) {
    const spark::Params *params = spark::Params::get_default();
    spark::SpendTransaction sparkTx(params);

    CSparkNameTxData sparkNameData;
    size_t sparkNameDataPos;
    BOOST_CHECK(sparkNameManager->ParseSparkNameTxData(tx, sparkTx, sparkNameData, sparkNameDataPos));

    modify(sparkNameData);

    if (fRecalcOwnershipProof) {
        const uint32_t MAX_HASH_FAILSAFE_ATTEMPTS = 1000;
        for (uint32_t n = 0; ; n++) {
            if (n >= MAX_HASH_FAILSAFE_ATTEMPTS) {
                BOOST_FAIL("Failed to recalc ownership proof after max attempts");
                return;
            }
            sparkNameData.addressOwnershipProof.clear();
            sparkNameData.hashFailsafe = n;

            CMutableTransaction txCopy(tx);
            CDataStream serializedSparkNameData(SER_NETWORK, PROTOCOL_VERSION);
            serializedSparkNameData << sparkNameData;
            txCopy.vExtraPayload.erase(txCopy.vExtraPayload.begin() + sparkNameDataPos, txCopy.vExtraPayload.end());
            txCopy.vExtraPayload.insert(txCopy.vExtraPayload.end(), serializedSparkNameData.begin(), serializedSparkNameData.end());

            CHashWriter ss(SER_GETHASH, PROTOCOL_VERSION);
            ss << txCopy;

            spark::Scalar m;
            try {
                m.SetHex(ss.GetHash().ToString());
            }
            catch (const std::exception &) {
                continue;   // increase hashFailSafe and try again
            }

            spark::Address sparkAddress(spark::Params::get_default());
            spark::OwnershipProof ownershipProof;

            spark::SpendKey spendKey = pwalletMain->sparkWallet->generateSpendKey(spark::Params::get_default());
            spark::IncomingViewKey incomingViewKey(spendKey);
            sparkAddress.decode(sparkNameData.sparkAddress);
            sparkAddress.prove_own(m, spendKey, incomingViewKey, ownershipProof);

            CDataStream ownershipProofStream(SER_NETWORK, PROTOCOL_VERSION);
            ownershipProofStream << ownershipProof;

            sparkNameData.addressOwnershipProof.assign(ownershipProofStream.begin(), ownershipProofStream.end());

            break;
        }
    }

    CDataStream serializedSpark(SER_NETWORK, PROTOCOL_VERSION);
    serializedSpark << sparkNameData;

    tx.vExtraPayload.erase(tx.vExtraPayload.begin() + sparkNameDataPos, tx.vExtraPayload.end());
    tx.vExtraPayload.insert(tx.vExtraPayload.end(), serializedSpark.begin(), serializedSpark.end());
}
// check the possibility to register a new spark name, return true if it's possible
bool ValidateSparkNameData(const CSparkNameTxData &sparkNameData, std::string &errorDescription);
🛠️ Refactor suggestion
Validate spark name data once before building transactions.
ValidateSparkNameData(...) is available, but it's not clear how consistently it's used before Spark Name data is appended to a transaction. Funneling all name validation (length, character set, and address-conflict checks) through a single entry point prevents corner-case bugs; a minimal sketch follows.
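A minimal sketch of such a single entry point, assuming ValidateSparkNameData is exposed by CSparkNameManager as the declaration above suggests; the wrapper function BuildSparkNamePayload and its placement are hypothetical, not part of the patch:

// Hypothetical wrapper (illustration only): the one choke point through
// which every spark name payload must pass before it reaches a transaction.
bool BuildSparkNamePayload(CSparkNameManager &mgr,
                           const CSparkNameTxData &sparkNameData,
                           CMutableTransaction &tx,
                           std::string &errorDescription)
{
    // Length, character-set and address-conflict rules are all enforced here,
    // so no caller can append unvalidated name data.
    if (!mgr.ValidateSparkNameData(sparkNameData, errorDescription))
        return false;

    // Only validated data is serialized into the extra payload.
    CDataStream serialized(SER_NETWORK, PROTOCOL_VERSION);
    serialized << sparkNameData;
    tx.vExtraPayload.insert(tx.vExtraPayload.end(), serialized.begin(), serialized.end());
    return true;
}

Routing every caller (RPC, wallet, GUI) through one such function keeps the validation rules from drifting apart across call sites.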
Rebase includes:

* UI improvements (#1475)
* Lelantus UI stripped
* More improvements
* Fixed UI hang issue during batch verification
* QT hanging improvements
* QT hanging fix
* Review commits applied
* coderabbit comments applied
* more coderabbitai comments resolved

---------

Co-authored-by: firstcryptoman <firstcryptoman@gmail.com>

* Spark names (#1532)
* Initial spark name architecture
* Spark address ownership proofs implemented.
* Missing files added
* Check the ownership proof for spark names: initial implementation
* Fixes to the core part of spark names
* Added additional field (core)
* Consensus parameters for spark names
* Fixed mempool bug
* Fixes in spark name conflict resolution
* RPCs for spark names
* Additional API for spark names tx creation
* Changed way of checking spark name tx
* Wallet API for spark name transaction creation
* API changes for spark name tx creation
* Added registersparkname RPC call
* Spark activation check for RPC
* Make spark names case-insensitive
* Spark name RPC fixes
* Faster lookup for spark name by spark address
* Fixes for spark name/address lookup
* Improvements for duplicated address detection
* Fixes for spark name state
* Block index entries for spark names
* Make dot (.) a legit symbol in spark name
* Spark name block number for testnet
* Fixed restoration of old spark name data if block is disconnected
* API for quick check of spark name transaction validity before the creation
* added isAddressMine function
* Check if the address belongs to the wallet before creating spark name transaction
* Fixed fee calculation for spark name
* Fix for spark names RPC
* Added ability to spend to spark names in "sparkspend" RPC
* UI fixes
* Additional validations
* Fix for crash with spend to unregistered name
* Fixed fee value when registering spark name for more than one year
* Spark name UI improvements
* UI modifications for sending to spark name
* Address book fixes related to spark names
* Fixed period of validity when creating spark name through GUI
* Extended spark name error info for RPC
* Fixed crash on non-HD wallet
* Request wallet password for spark name creation
* Fixed fee calculation for the spark name tx
* Fixed detection of previously used spark address for a spark name
* Unit test for spark names
* Additional unit tests
* Fixes #1533
* getsparknamedata RPC now returns JSON object
* Added "My own spark names" to the dropdown list in address book
* Added an option of displaying only own spark names for RPC. Closes #1535
* Set HF block for spark names
* Fixed a check for spark name block start
* Added tests for correctly respecting HF block number
* Check if we're over HF before spark name transaction creation
* new rpc for spark name (#1552)
* Fixed spark name tests
* Changed HF date
* Change of HF block number

---------

Co-authored-by: levonpetrosyan93 <petrosyan.levon93@gmail.com>
Co-authored-by: levoncrypto <levoncrypto1994@gmail.com>
Co-authored-by: levoncrypto <95240473+levoncrypto@users.noreply.github.com>
Co-authored-by: levonpetrosyan93 <45027856+levonpetrosyan93@users.noreply.github.com>

* Export View Keys (#1543)
* Add an RPC command to export the Spark view key.
* Show Spark View Key in Qt.

* Sigma pool closed, Extra payload extended (#1477)
* Change of emission rules
* Fixes for testnet
* Cleaning up code and tests
* Workaround for current devnet bugs
* Workaround for testnet
* Devnet parameter tweak
* Sigma pool closed
* Extra payload size limit increased
* Changed HF block for testnet
* Initial spark name architecture
* Spark address ownership proofs implemented.
* Missing files added
* Check the ownership proof for spark names: initial implementation
* Fixes to the core part of spark names
* Added additional field (core)
* Consensus parameters for spark names
* Fixed mempool bug
* Fixes in spark name conflict resolution
* RPCs for spark names
* Additional API for spark names tx creation
* Changed way of checking spark name tx
* Wallet API for spark name transaction creation
* API changes for spark name tx creation
* Added registersparkname RPC call
* Spark activation check for RPC
* Make spark names case-insensitive
* Spark name RPC fixes
* Faster lookup for spark name by spark address
* Fixes for spark name/address lookup
* Improvements for duplicated address detection
* Fixes for spark name state
* Block index entries for spark names
* Make dot (.) a legit symbol in spark name
* Spark name block number for testnet
* Fixed restoration of old spark name data if block is disconnected
* API for quick check of spark name transaction validity before the creation
* added isAddressMine function
* Check if the address belongs to the wallet before creating spark name transaction
* Fixed fee calculation for spark name
* Fix for spark names RPC
* Added ability to spend to spark names in "sparkspend" RPC
* UI fixes
* Additional validations
* Fix for crash with spend to unregistered name
* Fixed fee value when registering spark name for more than one year
* Spark name UI improvements
* UI modifications for sending to spark name
* Address book fixes related to spark names
* Fixed period of validity when creating spark name through GUI
* Extended spark name error info for RPC
* Fixed crash on non-HD wallet
* Request wallet password for spark name creation
* Fixed fee calculation for the spark name tx
* Fixed detection of previously used spark address for a spark name
* Unit test for spark names
* Additional unit tests
* Fixes #1533
* Testnet HF block set
* Mainnet HF block set

---------

Co-authored-by: Peter Shugalev <peter@shugalev.com>
Co-authored-by: levoncrypto <levoncrypto1994@gmail.com>

* Build fix (#1553)
* Build fix
* coderabbitai comment resolved
* Duplicated rpc removed

* Bump version to v0.14.14.1 Spark Names (#1550)

* secp256k1: CMake build system added
* cmake: add cmake folder
* bench: Add initial cmake support
* Add initial main CMakeLists.txt
* doc: add initial cmake support
* univalue: add initial cmake support
* zmq: add initial cmake support
* crypto: add initial cmake support
* wallet: add initial cmake support
* src: initial add of src/CMakeLists.txt
* toolchain.cmake.in: Adding toolchain.cmake.in support
* crypto: add support for CMake function check.
* bitcoin-cli: add CMake compilation.
* firo-tx: add CMake compilation. Improve miscellaneous CMake files.
* firod: add CMake compilation support.
* Improved misc CMake compilation support.
* Add bitcoin-config.h generation support.
* gmp: fix gmp link on OS X
* build: generate Linux installable package.
* CMake: add tests (test_bitcoin) compilation support.
* Qt: CMake GUI compilation support (Ubuntu)
* src/qt: Add macOS support for GUI with CMake compilation.
* depends: fix gmp compilation issue with mingw
* build: Add MingW support for CMake build system
* src: add backtrace compilation support macOS
* src: add backtrace compilation support Linux and MinGW-w64
* CMake: apply CodeRabbitAI suggestions.
* CMake: Add CI tasks for CMake build.
* Remove static from functions to fix debug linking
  In release builds, these functions are inlined, avoiding linkage issues. Debug builds (without inlining) caused undefined references in dependent libraries due to static visibility. Removing static resolves this while maintaining intended behavior in optimized builds.
* Add removed testcases and make BUILD_GUI=ON default option
* Added documentation to readme.md and fixed a lelantus test issue
* Fixed documentation, set ENABLE_WALLET=ON by default
  Remove outdated old sigma testcase
* Rebase to Wed Apr 23 11:39:34 AM BST 2025

---------

Co-authored-by: levonpetrosyan93 <45027856+levonpetrosyan93@users.noreply.github.com>
Co-authored-by: firstcryptoman <firstcryptoman@gmail.com>
Co-authored-by: psolstice <peter@shugalev.com>
Co-authored-by: levonpetrosyan93 <petrosyan.levon93@gmail.com>
Co-authored-by: levoncrypto <levoncrypto1994@gmail.com>
Co-authored-by: levoncrypto <95240473+levoncrypto@users.noreply.github.com>
Co-authored-by: cassandras-lies <203535133+cassandras-lies@users.noreply.github.com>
Co-authored-by: justanwar <42809091+justanwar@users.noreply.github.com>
PR intention
A spark name is an easily remembered alias for a spark address. This PR adds both core and UI functionality for creating and managing spark names. Registering a spark name requires a fee that depends on the name's length: the shorter the name, the more expensive the registration (a rough illustration follows).
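The actual tiers live in the consensus parameters; purely as an illustration of length-dependent pricing, with invented cut-offs and multipliers rather than the PR's real values:

// Illustration only: shorter names pay a multiple of the base fee.
// The real schedule is defined by this PR's consensus parameters.
CAmount ExampleSparkNameFee(size_t nameLength, CAmount baseFee)
{
    if (nameLength <= 2) return baseFee * 100;  // premium: 1-2 characters
    if (nameLength <= 5) return baseFee * 10;   // mid tier: 3-5 characters
    return baseFee;                             // standard: 6+ characters
}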
Code changes brief
Core functionality is added on top of the spark spend transaction: when spark funds are spent to the development fund address, spark name metadata is expected to follow in the transaction's extra payload (see the sketch below).
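On the consuming side, that metadata can be recovered with the ParseSparkNameTxData call already used in the tests above; the small wrapper here is a sketch, not code from the patch:

// Sketch: recover spark name metadata from a spark spend transaction.
// ParseSparkNameTxData and the data types come from this PR; the wrapper
// function itself is hypothetical.
bool TryGetSparkNameData(CSparkNameManager *mgr,
                         CMutableTransaction &tx,
                         CSparkNameTxData &sparkNameData)
{
    const spark::Params *params = spark::Params::get_default();
    spark::SpendTransaction sparkTx(params);
    size_t sparkNameDataPos = 0;
    // Returns false when the extra payload carries no valid name record.
    return mgr->ParseSparkNameTxData(tx, sparkTx, sparkNameData, sparkNameDataPos);
}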