Financial Advisory Blog

Armanino’s Financial Advisory blog is your source for thought leadership around cloud ERP and accounting solutions and integrations. Supported by the Cloud Accounting Institute and numerous experts in cloud, finance, reporting, integration, compliance, and technology, Armanino’s Financial Advisory blog features must-read content on what’s happening in the finance industry, case studies, white papers, and much more.

October 2, 2019

How the Right Accounting Tools Facilitate Digital Transformation

Posted by Narges Attaran


A team from Armanino attended BlackLine’s annual InTheBlack Conference to present a session about how companies are achieving digital transformation. The attendees learned how we work with CFOs to revolutionize their accounting workflows using a combination of BlackLine and Sage Intacct software.

As an introduction to our presentation, we wanted to highlight some of the ways that the right accounting tools facilitate digital transformation at companies large and small. A 2018 Gartner survey of CEOs and senior executives showed that 62% of respondents had digital initiatives underway, yet many are still struggling to make technology the centerpiece of everyday operations. We have some suggestions:

Seamlessly Integrate Data

No technology offers a complete solution. Digital transformation requires companies to implement a variety of technologies and, crucially, integrate them all effectively. For example, Sage Intacct and BlackLine are a natural pair for accounting because a specialized integration pack allows them to seamlessly share data. Thanks to this integration, both solutions work from the most complete and current information available, supporting objective decision making.

Eliminate Manual Inputs

Asking accountants to manually input data and move it between systems is a waste of their time and talent. Worse, it creates a high risk of human errors compromising data quality. Using more technology shouldn’t require more work, which is why integrated solutions are ideal. Manual inputs become largely unnecessary once different components of enterprise IT can seamlessly share data.

Fill In “Last Mile” Functionality

Echoing the theme that no technology is complete, companies need to identify their missing features and integrations. Ultimately, these gaps are the biggest obstacles to digital transformation because they lead to fractured (rather than seamless) capabilities. Instead of evaluating technologies in isolation, consider how they complement one another. For example, Sage Intacct is a powerful financial management solution, but its close management capabilities are limited. BlackLine offers an extensive close management toolkit that integrates tightly with Intacct data. Together, the two solutions form a complete whole.

Automate Routine Workflows

Companies expect digital transformation to make them more productive and efficient, not just more beholden to technology. For that to be the case, technology must be able to automate time- and labor-intensive workflows like the close process. Once technology is doing the lion’s share of the work, professionals can focus on more strategic efforts. In that way, digital transformation optimizes human talent.

Gain Enterprise-Wide Visibility

Sound financial decision making requires enterprise-wide visibility. Unfortunately, for reasons outlined above, data is often stuck in silos and excluded from important performance metrics. Decision makers can’t understand the true strengths or weaknesses of a company until they can monitor every aspect of it.

Fully integrated technologies provide that visibility into all aspects of accounting. When data is broadly and deeply available at all times, decision makers can confidently avoid obstacles and engineer opportunities – exactly what digital transformation strives to achieve.

Armanino helps clients navigate digital transformation and achieve their end goals using a combination of next-generation technologies and expert change management strategies. Contact us to learn more.  

September 3, 2019

Revamp or Start Fresh? Planning Your Salesforce Implementation

Posted by Lu Bai


As you plan a Salesforce upgrade such as adding Field Service Lightning or Service Cloud, you have to factor in a number of considerations to design the best approach. In some cases, you may decide that a total replacement makes more sense than upgrading a Salesforce organization that’s not meeting your company’s needs.

Some of the common problems that can inspire companies to think about upgrading or replacing their Salesforce org are issues with data integrity (including duplicate records and manual data entry), process automation, third-party add-ons, system integration, and insufficient or non-existent documentation.

While it’s tempting to launch an upgrade to address performance issues, it’s important to first invest the time to review the existing org thoroughly to understand what’s wrong, the best way to fix any issues, and the most effective project delivery methodology to ensure a smooth transition.

Identify Performance Problems

In some instances, data and performance issues can prevent you from upgrading your Salesforce org effectively. For example, a life sciences client of ours discovered during this plan-and-analyze phase that a lack of documentation and improper system design — along with integrity issues with historical data and a lack of Salesforce administration governance — would prevent them from adding Field Service Lightning and Service Cloud to their existing Sales Cloud implementation.

In this situation, starting over with a fresh implementation allowed the company to clean its data internally before uploading it to a new Salesforce org, rather than migrating the data in its raw format. This ensured compatibility and allowed the company to include the Field Service Lightning and Service Cloud functionality in the new org.

Deciding how to approach an implementation — whether it’s an upgrade or replacement project — requires detailed discussions among business and technology leaders, end users and your implementation partners to address basic questions such as:

  • Why are we going through this implementation?
  • What are we trying to achieve?
  • Are we ready?
  • Have we gathered enough requirements internally to hand off to our consulting partner?

This stage will often uncover performance issues such as:

  • Redundant and manual processes to enter data
  • Data integrity issues that require time and manual effort to correct
  • Excessive triggers and Apex code with little documented business justification
  • Inappropriate usage of standard objects
  • Overuse of record types and custom fields
  • Lack of documentation for existing customizations

The project analysis phase also involves gathering requirements down to the user story level, which means defining what the key employees who use Salesforce do every day. A user story includes three main pieces of information: the user’s role, what they need to do, and the function they are supporting. For example: “As a field service technician, I need to log the parts used on a work order so that billing and inventory stay accurate.”

These user stories will help you make key decisions while the implementation is being designed, determine development milestones and, ultimately, understand whether an implementation is successful.

It’s also important to include change management and user training in your implementation plan. Helping users understand what’s being implemented, the problems and processes you’re working to correct, and the best way to use the new org will all be important in mitigating risks and driving user adoption.

You should also pay attention to leading industry practices — such as making sure a Field Service Lightning implementation is FDA compliant, and automating data exchange between your CRM and ERP systems — as well as any regulatory considerations. For instance, a medical device company must track the components it purchases and the products it ships, as well as products that are returned or repaired. Salesforce, Field Service Lightning and Service Cloud can help with these tracking requirements.

Choosing a Methodology

Once you determine the goals for your implementation project, the next step in the planning process is deciding how you’re going to approach the initiative.

In general terms, most technology implementations fall into one of two project management frameworks: waterfall or agile. The waterfall framework calls for defining your requirements in the earliest phase of a project, building and testing features to meet those requirements, training users, and deploying the software.

Waterfall offers a “big bang” approach that’s best suited for situations where you understand your organization’s requirements and don’t expect significant issues or design changes as the software is implemented.

The agile framework, in contrast, is more of a “define as you go” approach in which different elements of a project are developed in short periods known as sprints. A need is defined, features are developed and deployed, and the overall project continues with a focus on a different feature or aspect.

The agile framework is best suited for situations where the company’s requirements aren’t understood as clearly, or it needs to correct specific aspects before continuing with an upgrade. With some projects, problems emerge during the implementation phase that require the company and its implementation partner to change priorities mid-stream. The agile framework offers the flexibility to shift the project’s focus in response to newly discovered needs.

Some implementations are best served with a hybrid approach in which an overall design is developed under a waterfall methodology, but the implementation phase follows a series of agile sprints (this was a successful approach for our life sciences client’s implementation). This hybrid blends waterfall’s big-picture project design with the flexibility and frequent communication more typically associated with agile implementations.

Comparing these methods and choosing one (or the best features of both) provides an important framework that will, in turn, influence your project planning and the ultimate success of your Salesforce upgrade or replacement. To learn more about ensuring a successful Salesforce implementation, contact Consulting.Marketing@armaninollp.com

May 2, 2019

Migrating to Salesforce Lightning: The Fast, the Beautiful and the Unique

Posted by Lu Bai


Salesforce Lightning provides a modern, robust and productive user experience designed to help boost sales and service and better support customers. It’s not just an interface; it’s a set of features and functionality built to help deliver customer success.

As many of you have heard, migrating to Lightning has been a hot topic of conversation for the past few years. Salesforce will be taking a big step toward the future on Oct 12, 2019, by flipping the switch and making Lightning Experience the default for all users. 

In reality, of course, it’s not as easy as just flipping a switch. There are migration issues you need to watch out for, such as:

  • Custom URLs
  • JavaScript buttons
  • Visualforce pages
  • Managed package compatibility

All of these pieces of the puzzle can delay the migration roadmap and decrease Salesforce utilization. To avoid missing or underestimating these problems, you need to review your business processes, examine the utilization level, and validate the level of effort indicated in the Salesforce readiness report (more details on that below) against actual use cases.

Lightning Fast: Buckle Up and Get Ready

Depending on how customized your Salesforce instance and business processes are, Lightning migration can range from two months to more than two years. A great starting point is a review of Salesforce’s Lightning Experience readiness report. It lays out the components that need to be migrated with an estimated level of effort, which serves as a great reference for planning out your migration journey.

As Lightning and Classic features evolve rapidly, keeping up with release notes and Lightning vs. Classic feature comparisons will help you prepare for the move.  Don’t forget Salesforce’s native training platform, Trailhead; from there you can find tons of Lightning-related modules and trails to familiarize yourself with Lightning’s features and help you get ready for the ride.

Lightning Beautiful: Always a Scenic Route

From a beautifully designed UI and page layouts to customized components and Einstein Analytics (formerly Wave) reporting, Lightning shines everywhere. But during a migration, it’s easy to miss some of the things that make it so appealing. To paint a full picture of your migration journey, you need a strategic plan that includes:

  • A project and budget plan with a timeline and milestones
  • A business process review
  • Predefined KPIs and success metrics
  • User stories summarizing the who, what, when, how and why
  • A pilot group of “power users” to challenge, design, adopt and train
  • Change management to drive ultimate user adoption

Lightning Unique: Your Path, Your Way

Each Lightning migration journey is unique, with its own challenges and sparks. But don’t view it as a long, painful process of conducting research, holding team meetings and custom-developing your own solution. Instead, use it as an opportunity to revamp your Salesforce instance for better utilization and user adoption.

Our Salesforce experts can help you assess, plan and execute your migration to Lightning. Contact us to learn more about optimizing your journey.

July 26, 2018

Biotech Firm Sequences Software Implementations for a Quick Win

Posted by Armanino Financial Advisory Team

A clinical-stage biotechnology company had outgrown their ERP and wanted to upgrade to Microsoft Dynamics 365, but they were concerned that they didn’t have the time and manpower to devote to launching the new system. Armanino assessed their business requirements and pain points, and found that their finance team could free up resources for the ERP implementation by streamlining the month-end financial close process first.

Finance was devoting significant time to the financial close—manually reconciling balance sheet accounts, doing variance analysis and analyzing vendor accounts. Like many companies, they handled most of the process in static spreadsheets, so they had to deal with broken links and formulas, and touch files multiple times to certify and verify them.

Managing tasks with a manually updated checklist was also cumbersome. Since the checklist lacked version control, if one member of the team had it open, no one else could access it, which led to confusion over which tasks were still outstanding.

Another challenge was that the firm worked with a number of contract research organizations. Invoices for work from the third-party organizations, as well as accruals from clinical manufacturing and general business, needed to be matched to purchase orders. Manual matching caused lag time and often resulted in late adjustments, which would push the close even later.

With so much time focused on the financial close process, employees weren’t going to be able to devote enough time to implementing and learning to use the new ERP. The firm needed a quick way to get the ball rolling.

Sequencing the implementation

We suggested they add BlackLine, a cloud-based financial close automation solution, to their technology stack. Automating the financial close process would allow the finance team to spend less time on repetitive tasks and more time on high-value activities, like getting their new ERP up and running.

We determined the best approach was to implement the software solutions in stages.

First up was installing BlackLine’s Task Management functionality. With Task Management, the close lead can assign tasks to team members, and the integrated checklist automatically updates as work is completed. Real-time dashboards gave the CFO and controller increased visibility into the close process, so bottlenecks and delays could be immediately identified and addressed. Tracking all close activities on the platform kept the finance team from doing redundant work and minimized time spent waiting for other tasks to be finished, so they could focus on implementing the new ERP.

While BlackLine is platform-agnostic and can pull data from any ERP, it has prebuilt integrations with numerous best-of-breed systems, including Dynamics 365. Because of this, it made sense to launch the ERP as the second stage of the implementation. With that connection in place, the flow of transaction information between the two systems could be automated, which would ensure data integrity and minimize human error.

Once the connection to Dynamics 365 was in place, the final stage was activating the Account Reconciliation portion of BlackLine, so that account reconciliation, transaction matching and variance analysis occurred on the platform.

BlackLine can automatically certify lightly used accounts or zero balance accounts, eliminating the need for a person to create a spreadsheet and sign off on the result. Automated transaction matching and exception handling minimizes tedious ticking and tying, and if a previously reconciled account changes because of activity elsewhere during the close, BlackLine will send a notification that the reconciled account has been decertified.

Starting strong with a quick win

Growing pains are inevitable when implementing new solutions, but staging the rollout of BlackLine and Microsoft Dynamics 365 allowed the biotech firm to gain the benefits of both platforms without the problems that often occur when you try to update the workflow of an already overtaxed staff. They recognized the amount of work necessary and avoided plugging in their new ERP without first giving employees the bandwidth to focus their efforts on integrating the new software.

Having the ability to implement aspects of BlackLine in stages gave the company greater control over change management, and fostered greater adoption and increased appreciation for the new tech stack, because employees had time to learn how to use it to lighten their daily workload. Building up momentum with the BlackLine implementation gave the company an early return on investment via a faster close, and prepared them for the wider launch of systems that will improve their financial management for years to come.

Discover how to strategically plan your organization’s cloud solutions – from financial close automation and ERP to FP&A, CRM and beyond – with Armanino’s Strategy & Transformation team.

December 10, 2012

AICPA Digital CPA Survey

Posted by Lindy Antonelli

“We’re at a defining moment in the accounting profession,” said Erik Asgeirsson, president and CEO of CPA2Biz. “It’s now possible for small- and medium-sized businesses to tap powerful technologies that make them more productive and offer faster, better insight into financial decision-making. But most of these companies need a tech-savvy business advisor to help them take advantage of these opportunities, and that’s a role CPAs are uniquely qualified to fill.”

A recent survey from AICPA polled 624 AICPA members representing a mix of small to large public accounting firms from Sept. 12-28. The Digital CPA Survey revealed insights into the use of cloud technology, the role of CPAs in adopting technology, barriers to adopting cloud solutions, and benefits of the cloud. We will share key findings from that survey and AICPA’s infographic below.


August 21, 2020

Mitigating the Privacy Risks of Remote Working

Posted by Pippa Akem


As the flow of data increases between systems, locations, organizations and vendors, it’s important to prioritize privacy controls to mitigate privacy risks of the evolving pandemic landscape. If you haven’t established proper privacy controls already, now is a perfect time to implement them.

Key Risk Areas

Whether you need to create these controls from scratch or already have a sophisticated privacy control environment, the increasing number of remote employees has a profound impact on control strategies. As a result, it’s essential to ensure you have processes in place that provide insights into key risk areas related to remote-working environments. As you plan how to achieve successful privacy controls for remote workers, you can boil it down to six main considerations.

1. Understanding the Risks

When more and more employees transition to remote work, sensitive data begins to flow through numerous locations — increasing the risk of your data being compromised. The first step your privacy team should perform is an impact assessment. You can do this internally or outsource it, depending on your business model. The important thing to keep in mind is that if your evaluation uncovers new risks, you should create new controls.

Impact assessments should be risk-based and focus on all the sensitive data you process. If confidential information is moving across new servers, networks and other mediums, there are likely new risks. Privacy risk considerations can extend into areas outside of your technological systems, such as an employee viewing protected health information at home while living with others who can potentially access it.

2. Clear Documentation

Once you finalize your new controls, it’s important to document them clearly. The benefits of straightforward documentation are twofold. First, remote employees can understand their tasks within the controls process and perform them in a standardized manner. Additionally, precise documentation of execution makes it easier for auditors to verify the controls’ effectiveness.

Many processes that used to require physical tasks (e.g., scanning, signoffs, filing) now need to be adapted to a remote environment. Digital document storage systems, certificates and other digital workflows can help streamline your remote documentation processes.

3. Communication

Your new policies and procedures are only useful if employees understand them and can access them whenever needed. For example, with in-person sessions no longer an option, training should shift to an online format without losing its effectiveness. After initial training, you can communicate your privacy controls through a company intranet, shared drive or another medium with anytime, anywhere access.

By communicating your controls through a single source of truth, your messaging is consistent. But again, you should consider that risk mitigation isn’t achieved only through technology. Management should align behind a uniform privacy governance strategy, too, so available information is unvarying across different teams.

4. Gaining Collective Buy-In

To inspire meaningful change and improve data security, management should not only align behind a single strategy but also believe in the importance of data security. Once management acknowledges how vital privacy controls are, that belief can cascade throughout the organization through the way leaders discuss it. This encourages employees to prioritize data security. You can even create incentive structures that drive execution and inspire everyone to treat sensitive data with the utmost care.

5. Monitoring

As your business continues to adapt to your new privacy controls, monitoring their performance is critical. The first step is to develop metrics that can quantitatively measure performance. Once you’ve created metrics, your privacy team should develop feedback mechanisms and undergo periodic audits to evaluate performance continuously. After the assessments, report the results clearly and concisely to all stakeholders, so the efficacy of your controls is clearly communicated.

6. Continuous Improvement

Upon receiving measured results from the monitoring process, management should leverage them to drive iterative improvement. Having feedback and reports flow into incremental changes will help fine-tune the controls and drive efficiency. During improvement initiatives, it’s best to prioritize the areas with the most privacy risk exposure.

Going Forward

As businesses transition to the remote environment, it’s easy to lose focus on compliance initiatives. Non-compliance isn’t only costly; it can also have detrimental intangible consequences for your organization, such as losing your customers’ trust, which can in turn lead to tangible losses if those customers take their business elsewhere because they don’t trust you to keep their data safe.

Adjusting your privacy controls to a more digitized environment is only going to become more necessary as technology expands into even more facets of your organization and its operations. Ensuring your entire organization understands why you have these privacy controls, executes them effectively and continues to improve them, can ease your transition into a remote-working world without slowing down your efficiency.

If you’d like to discuss how to revise your privacy controls most effectively, Armanino’s privacy experts are offering a free consultation. Email Pippa.Akem@armaninoLLP.com to start your privacy transformation today!

May 11, 2020

Proof of Reserves: Elevating Standards of Trust and Transparency for Digital Asset Ecosystems

Posted by Noah Buxton


Armanino’s new Proof of Reserves service provides improved trust and transparency between a digital asset exchange and its users. Here’s a look at how it works and why it was created.

Money and Trust

Money requires trust. And yes, bitcoin is money, although we’ll call it and other cryptocurrencies “digital assets” in this article.

Given their infancy and complexity, digital assets, more so than traditional financial instruments, require user trust for continued adoption and use. Yet the trust and transparency mechanisms available to users of traditional financial instruments are largely unavailable to users of digital assets.

In a nutshell, the problem for users of centralized digital asset exchanges — as well as custodians, loan platforms and other “service providers” that custody digital assets on behalf of their users — is that they are required to place great trust in the exchange without formalized and available mechanisms of trust and transparency.

Back-of-the-napkin calculations for the top 10 cryptocurrencies by market cap, assuming that exchanges hold a conservative 10% of those assets (roughly 10% of a combined market cap in the range of $160B to $200B), suggest that exchanges currently custody about $16B-$20B worth of crypto. Again, with little to no meaningful transparency.

The solution: give exchanges a way to provide independent validation and proof that they have adequate reserves of a given digital asset to meet the “IOUs” to their customers. We call it a “proof of reserves assessment.”

Now exchanges and other custodians can provide proof of reserves under what we believe to be a model framework and methodology. Why is this the new model for proving digital asset reserves? Simple: because of the independence and reputation of the assessor, cryptographic proofs of user liabilities, and the ability for all users to verify that their balance was included while maintaining user privacy.

Background

Armanino LLP, a top 25 U.S. public accounting firm, has been conducting audits on cryptocurrency clients since 2014. As a necessary component of providing financial statement audit services for exchange clients, we encountered the challenges of auditing a bitcoin-heavy balance sheet. There were many lessons learned completing complex audit procedures to test ownership and existence of digital asset balances.

As a leader in digital asset assurance technology with a reputation for innovative solutions for the crypto space, we felt compelled to solve the “proof of reserves” problem.

In 2014, another leader in cryptocurrency, Greg Maxwell (now CTO at Blockstream), proposed a solution to proving an exchange’s reserves against its customer liabilities. We have adopted some features of this approach but also solved some of the key problems we believe have prevented it from being more widely used by exchanges, demanded by customers and required by regulators as a condition of licensure.

In May 2020, Armanino partnered with Gate.io, a top 20 exchange by daily volume, to complete the first external proof of reserves assessment.

Problems

Auditability, Trust and Transparency Issues

Blockchains offer the promise of immutability, decentralized trust, and auditability. However, current centralized exchanges, custodians and loan providers create blind spots in the ecosystem because user liabilities are recorded as proprietary database entries controlled by the service provider. Additionally, it is commonplace, if not universal, for customers’ digital assets to be held in co-mingled wallets. As a result, the promise of publicly available and auditable ledgers is undermined.

Exchanges and other businesses that custody crypto assets in the U.S. are required to submit audited financial statements to state regulators in order to maintain state money transmission licenses. And while those audits provide a layer of trust and independent oversight, it is not necessarily true that audits provide needed transparency for end consumers. What’s more, the majority of digital asset trading volume (through marketplaces and exchanges) takes place outside the U.S., where regulatory oversight is generally less stringent.

Bad Incentives

While we believe that many exchange and custody businesses operate with the best interests of their customers in mind, there are lucrative incentives to “cheat” by only partially reserving against customer liabilities, loaning customer assets, or otherwise encumbering customer assets without the customer knowing or consenting. Take the case of Quadriga, a Canadian crypto exchange service provider that made headlines after the untimely death of the founder and later realizations that he had misappropriated user funds for his own benefit.

There is also the simple truth that exchanges and custodians remain “honey pots” for hackers, and incentives to not disclose hacks resulting in the loss of customer assets are very high. In these cases, the exchange would continue business with fractional reserves in hopes of refilling the coffers and not losing customers in the short term.

Defining “Adequate” Reserves

So, what is an “adequate” level of reserves for an exchange to hold against customer liabilities? It’s a question our industry has yet to answer.[1] For comparison, retail banks in the U.S. hold about $0.20 for every $1.00 of customer deposits, loaning out the other $0.80. Among exchanges, there seems to be an unwritten rule that 100% of customer liabilities must be reserved, but at this time there is a dearth of data that would allow us to draw conclusions about how well the world’s exchanges and custodians are reserved.

It may be the case that customers, and even regulators, would deem a reserve of less than 100% to be acceptable. However, until providing proof of reserves becomes more commonplace, such a standard cannot emerge.

Non-Standard Approaches

As of May 2020, there are only a handful of examples of exchange providers completing a proof of reserves exercise. We are still in very early days. Among the exchanges that have performed a proof of reserves, there is a vast disparity in the methods and approaches utilized, the level of transparency provided for users to understand those methods, the independence of the party performing the testing, and so on. Furthermore, until now, no proof of reserves assessment had been performed by an independent third party, let alone by an independent public accounting firm issuing a formal report on findings.

A lack of standards and independence in assessing reserves leads to confusion and bad outcomes for users.

The Solution: A New “Gold” Standard

Assets vs. Liabilities

Proof of reserves has two main components. First is validating that all the customer database entries are complete and accurate. We must obtain a reasonable level of assurance that (1) all accounts with non-zero balances are included (a measure of completeness), and (2) that the non-zero balance of bitcoins “held” or owed to that customer is correctly calculated by the exchange (a measure of accuracy).

Second, we must confirm with cryptographic certainty that the exchange has hot and cold wallets with enough bitcoins to meet all customer liabilities.

We have enumerated the specific procedures we performed for each of the two components outlined above in our formal report on the first proof of reserves assessment for Gate.io. (Watch this space for more thought leadership in this evolving area.)

Privacy-Preserving Proofs of User Liabilities

First, we provide a publicly accessible portal for users to “check balance” (the Verifier tool). This provides a very important check against the exchange under-reporting customer liabilities.

Here’s how we do that. After obtaining reasonable assurance that the exchange has produced a complete and accurate listing of hashed user IDs and associated BTC balances, we use that raw file to generate a Merkle Tree (a cryptographic proof that hashes the user ID and balance together, then combines it with the hash of the next user ID and balance, continuing on until there is a tree or pyramid of hashes, with a single “root” hash at the top).
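
To illustrate the idea, here is a minimal Python sketch of how such a tree can be constructed from hashed user IDs and balances. It is only a conceptual example: the hash function (SHA-256), the leaf format and the pairing rule for odd-sized levels are assumptions for illustration, not the exact scheme used in the Gate.io assessment.

    import hashlib

    def sha256(data: bytes) -> bytes:
        # SHA-256 is assumed here; the production scheme may differ
        return hashlib.sha256(data).digest()

    def leaf(hashed_user_id: str, btc_balance: str) -> bytes:
        # Hash the user ID and balance together to form one leaf of the tree
        return sha256(f"{hashed_user_id}:{btc_balance}".encode())

    def merkle_root(leaves: list) -> bytes:
        # Combine adjacent hashes level by level until a single "root" hash remains
        level = list(leaves)
        while len(level) > 1:
            if len(level) % 2 == 1:
                level.append(level[-1])  # duplicate the last node on odd-sized levels
            level = [sha256(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
        return level[0]

    # Hypothetical (hashed user ID, BTC balance) records
    records = [("a1b2c3", "0.50000000"), ("d4e5f6", "12.00000000"), ("a7b8c9", "0.00100000")]
    root = merkle_root([leaf(uid, bal) for uid, bal in records])
    print(root.hex())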

The Merkle Tree can be made available to anyone without compromising privacy, as it is just hashed data; only the user with the known inputs (hashed user ID and their balance) can validate that their information was included in the proof. To simplify the user experience, we built a more user-friendly tool on TrustExplorer™ to allow users to interact with the Merkle Tree and Verifier tool.   
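
Conceptually, a user-side inclusion check works like the sketch below: given the user’s own leaf, the sibling hashes along the path to the root (the Merkle proof), and the published root, anyone can recompute the root and confirm their balance was counted. The proof format here is an assumption for illustration; the TrustExplorer Verifier tool handles these details for the user.

    import hashlib

    def verify_inclusion(my_leaf: bytes, proof: list, published_root: bytes) -> bool:
        # 'proof' is a list of (sibling_hash, sibling_is_left) pairs from leaf to root
        node = my_leaf
        for sibling, sibling_is_left in proof:
            pair = sibling + node if sibling_is_left else node + sibling
            node = hashlib.sha256(pair).digest()
        return node == published_root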

Proving the Exchange’s Control Over Assets

There are multiple important steps to proving ownership or control over digital assets (i.e., ability to exercise the private keys for a public address or wallet). First, we inquire with management and responsible individuals to obtain an understanding of the process for receiving customer bitcoins, safely storing customer bitcoins, and sending customer bitcoins outside the exchange when withdrawn by customers. We think of this generally as the process for custody and management of customer assets.

We also inquire with management and responsible individuals to obtain an understanding of the key technical components used in the custodial process. Important considerations here include, but are not limited to, the types of wallets used, and the hardware and software involved in hosting, maintaining, integrating and interacting with those wallets, as well as the signature schemes utilized (i.e. single-sig vs. multi-sig).

After gaining an understanding of the custody process and infrastructure, we collect a complete list of BTC addresses to test both balances and control over private keys.

For each of the hot and cold wallet addresses in scope, Armanino creates a custom message, including (1) a “nonce” or secret, and (2) the hash of the most recent bitcoin block. We then obtain a digital signature of that custom message from the exchange, and verify that signature against the relevant public key(s) (wallet addresses) to confirm that the exchange signed the message using the private keys for those wallets.
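
The core of that step is a standard digital-signature check. The sketch below uses the open-source Python ecdsa package on the secp256k1 curve to show the idea; Bitcoin’s actual signed-message format (address encoding, recoverable signatures, and so on) involves additional details that are omitted here, and the nonce and block hash values are hypothetical.

    import hashlib
    from ecdsa import SigningKey, SECP256k1, BadSignatureError

    # The assessor builds a unique challenge: a secret nonce plus the latest block hash,
    # so a valid signature must have been produced freshly rather than replayed
    nonce = "f3a9c1d7"                                   # hypothetical secret
    latest_block_hash = "0000000000000000000a7b3c"       # hypothetical block hash
    message = f"{nonce}|{latest_block_hash}".encode()

    # Exchange side: signs the message with the wallet's private key
    wallet_key = SigningKey.generate(curve=SECP256k1)    # stand-in for the real key
    signature = wallet_key.sign(message, hashfunc=hashlib.sha256)

    # Assessor side: verifies the signature using only the public key
    public_key = wallet_key.get_verifying_key()
    try:
        public_key.verify(signature, message, hashfunc=hashlib.sha256)
        print("Exchange demonstrated control of the private key")
    except BadSignatureError:
        print("Signature invalid: control not demonstrated")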

It is also possible to test ownership by movement of funds (signing a transaction with the private keys); however, this approach is more open to the risk of cheating or colluding with a third party that actually controls the private keys. For this reason, our preferred methodology is to obtain signed messages simultaneously with the other procedures performed.

At the time we obtain signatures, we also pull balances from the bitcoin blockchain for each of the wallet addresses in scope.
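
As a simple illustration of that balance check, the sketch below queries a public block explorer for each address and sums the results against total customer liabilities. The use of Blockstream’s Esplora API and the comparison against a single liability total are assumptions for illustration; an actual assessment might source balances differently (for example, from a dedicated full node).

    import requests

    def address_balance_btc(address: str) -> float:
        # Query a public explorer (Blockstream Esplora); a full node could be used instead
        resp = requests.get(f"https://blockstream.info/api/address/{address}", timeout=30)
        resp.raise_for_status()
        stats = resp.json()["chain_stats"]
        satoshis = stats["funded_txo_sum"] - stats["spent_txo_sum"]
        return satoshis / 1e8  # convert satoshis to BTC

    in_scope_addresses = ["1A1zP1eP5QGefi2DMPTfTL5SLmv7DivfNa"]  # hypothetical address list
    total_assets = sum(address_balance_btc(a) for a in in_scope_addresses)
    total_liabilities = 0.501 + 12.0                             # hypothetical sum of customer balances
    print("Reserves cover liabilities:", total_assets >= total_liabilities)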

The Frequency of Proof

Digital assets have an advantage over non-natively digital assets in that their existence, and the rights and obligations attached to them, can be determined more readily. As discussed above, the standards for proof of reserves are just now emerging. There is room for flexibility in the frequency of proof of reserves assessments; exchanges and other custodians should choose a cadence of reporting that works best for their organization.

We are also very excited about the potential to provide proof of reserves in a real-time or near-real-time manner. More on that soon!

Conclusion

Proof of reserves assessments using the methods outlined here, on TrustExplorer and in our final report on Gate.io’s proof of reserves assessment provide a foundation for standardizing proof of reserves. We believe this is a large step forward in providing needed trust and transparency for customers of exchanges, as well as comfort for regulators across many jurisdictions. We are committed to working with our clients and the community to develop a best-in-class solution for trust and transparency over digital asset reserves. If you share the vision, please reach out!


[1] As one example, see Hong Kong’s SFC outlining 100% reserve requirements (98% in cold storage, 2% in hot wallets): https://www.sfc.hk/web/EN/files/ER/PDF/App%202_%20Conceptual%20framework%20for%20VA%20trading%20platform_eng.pdf


May 7, 2020

Armanino Joins Hedera20 Hackathon and Issues a Unique Challenge for Competitors

Posted by Noah Buxton

"Digital signatures can revolutionize how we exchange information and prove ownership. Armanino's Signing Challenge will be exciting for the Hedera developer community to create a foundational tool that can fuel applications across nearly every industry. I cannot wait to see what they build!" ~Cooper Kunz, Developer Advocate Lead, Hedera Hashgraph

Armanino is proud to partner and participate in Hedera Hashgraph’s Hedera20 Hackathon!  And we are issuing a unique challenge for the competition.

As an accounting and consulting firm, we know that auditability and trust fuel adoption of new tools. Our challenge is for teams to build a robust, open-source signing service for Hedera wallets that enables users to prove account/wallet ownership to counterparties, management and auditors. Winners will receive pairs of Bose QuietComfort II Headphones. (This is in addition to the other prizes available, including a grand prize of $20,000 worth of HBAR.)

You may ask yourself: why would an accounting and consulting firm be involved in a hackathon? It’s simple: Armanino’s relationship with clients doesn’t end at providing services; it begins there.

We are always looking for ways to make a positive impact in the lives of our clients, our people and our community.  With this in mind,  the potential to find interesting and diverse ways to partner with our clients and fuel innovation is limitless. 

The Hedera20 Hackathon is all about innovation and community collaboration. By providing creative and talented minds the opportunity to develop new solutions, this hackathon is an event we are proud to be a part of.

For more information, visit the Hedera20 Hackathon website.

