Data Protection and Backup Solutions

Explore top LinkedIn content from expert professionals.

  • View profile for Lory Kehoe

    Aave Labs EU Director & Push Ireland CEO | Blockchain Ireland Founder & Chair | Trinity College Dublin Adjunct Asst. Prof. | Board Member

    54,704 followers

HSBC's report on 'Asset Tokenisation in the Quantum Age: Future-proofing gold tokens with post-quantum security'

1. HSBC Leads with World-First Quantum-Secured Gold Tokenisation
- HSBC became the first global bank to offer tokenised physical gold to both institutional and retail investors via its Orion digital asset platform.
- In collaboration with Quantinuum, they’ve successfully trialled the world’s first quantum-secure tokenisation of gold, marking a pioneering move in post-quantum finance.

2. $16 Trillion Opportunity Meets Quantum Threat
- Asset tokenisation is on track to become a $16 trillion market by 2030 (Boston Consulting Group), revolutionising how people invest in gold, real estate, bonds, and art.
- But here's the catch: quantum computing threatens the cryptographic backbone of this entire ecosystem, forcing institutions to act now to secure digital assets for the future.

3. Post-Quantum Cryptography Without the Pain
- HSBC deployed Post-Quantum Cryptography (PQC) via a PQC-secured VPN, offering a cost-effective, low-latency way to secure DLT networks without redesigning the entire system.
- Their proof-of-concept showed no performance loss, with transaction speeds hitting 40 TPS and latency staying below 3.1 seconds.

4. Quantum Keys > Random Keys
- Enter Quantum Random Number Generators (QRNGs): a next-gen security layer where randomness isn’t guessed; it’s quantum-proven.
- HSBC’s solution boosts key strength and data unpredictability by integrating QRNGs that inject entropy directly into the Linux kernel, making encryption truly future-proof (see the sketch after this post).

5. Interoperable, Cross-DLT, Retail-Ready
- HSBC’s gold tokens can now move securely across blockchains, including conversion into ERC-20 tokens, enabling wider distribution across wallets and platforms.
- Their system supports fractional gold ownership, opening doors for retail investors while maintaining institutional-grade security.

So What?
- Tokenisation is the future of finance, but quantum is a material risk.
- HSBC’s work sheds light on a possible path forward.
- This potentially sets a new standard for digital asset infrastructure and serves as a blueprint for every financial institution looking to future-proof their tokenisation strategy.

Great work Prashant Malik, Philip Intallura Ph.D, Duncan Jones, Kimberley Fewell, Mark Williamson, Del Rajan, Ben Merriman
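For readers curious what point 4 can look like in practice: on Linux, a privileged process can credit externally sourced randomness to the kernel entropy pool via the RNDADDENTROPY ioctl. The sketch below is illustrative only, not HSBC's implementation; qrng_read is a hypothetical stand-in for whatever interface a QRNG vendor actually exposes.

```python
import fcntl
import os
import struct

RNDADDENTROPY = 0x40085203  # _IOW('R', 0x03, int[2]) on Linux/x86-64


def qrng_read(n: int) -> bytes:
    """Hypothetical stand-in for a vendor QRNG interface."""
    return os.urandom(n)  # replace with real quantum-sourced bytes


def inject_entropy(random_bytes: bytes) -> None:
    """Credit externally sourced randomness to the kernel entropy pool.

    Requires root. The payload follows struct rand_pool_info:
    an entropy count in bits, a buffer size in bytes, then the buffer.
    """
    header = struct.pack("ii", len(random_bytes) * 8, len(random_bytes))
    fd = os.open("/dev/random", os.O_WRONLY)
    try:
        fcntl.ioctl(fd, RNDADDENTROPY, header + random_bytes)
    finally:
        os.close(fd)


inject_entropy(qrng_read(64))
```

Crediting entropy (rather than merely writing to /dev/random) is what lets the kernel count the quantum-sourced bits toward its randomness estimate.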

  • View profile for Vaughan Shanks

    Helping security teams respond to cyber incidents better and faster | CEO & Co-Founder, Cydarm Technologies

    12,048 followers

Last week #NIST released three post-#quantum #encryption standards. Why is this significant? Put simply, from a practical standpoint: risk management and compliance.

First, on risk management: experts now say that quantum computing is less than a decade away. Quantum computers are expected to break the public-key algorithms underpinning today's encryption, and to dramatically speed up searches of symmetric keyspaces, which means they will be able to decrypt much of what is currently encrypted. Moreover, it is entirely plausible that encrypted information recorded today is being stored for decryption when quantum computing becomes available. If you proactively apply quantum-resistant encryption to your data now, you will reduce the risk of an adversary successfully exploiting your data once they have access to quantum computing.

Second, on compliance: NIST is the standards body for the USA, and many other nations adopt its encryption standards, as they do not have resources at the same scale as NIST. You can be certain that NIST-approved post-quantum algorithms will start being mentioned in various compliance checklists, as is the case currently with algorithms such as AES-256 and SHA-256. Note well that these algorithms have #FIPS numbers associated with them, meaning "Federal Information Processing Standard".

Briefly, the approved algorithms are:
🔒 ML-KEM, for encrypted key exchange, as FIPS 203 (see the sketch after this post)
🔒 ML-DSA, for digital signatures, as FIPS 204
🔒 SLH-DSA, for stateless hash-based digital signatures, as FIPS 205

There is a fourth algorithm, FN-DSA (derived from FALCON), also used for digital signatures, that is expected to be released in the next year.
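For a concrete feel for ML-KEM, here is a minimal encapsulation round trip, assuming the open-source liboqs-python bindings (import oqs) and a liboqs build with ML-KEM-768 enabled; a sketch, not a production key exchange.

```python
import oqs

# Client generates an ML-KEM-768 keypair (FIPS 203).
with oqs.KeyEncapsulation("ML-KEM-768") as client:
    public_key = client.generate_keypair()

    # Server encapsulates: derives a shared secret plus a ciphertext
    # that only the holder of the matching private key can open.
    with oqs.KeyEncapsulation("ML-KEM-768") as server:
        ciphertext, server_secret = server.encap_secret(public_key)

    # Client decapsulates the ciphertext with its private key.
    client_secret = client.decap_secret(ciphertext)

assert client_secret == server_secret  # both sides now share a symmetric key
```

In deployed protocols the shared secret would feed a KDF to derive session keys, typically in hybrid mode alongside a classical exchange such as X25519.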

  • View profile for Marcos Carrera

    💠 Chief Blockchain Officer | Tech & Impact Advisor | Convergence of AI & Blockchain | New Business Models in Digital Assets & Data Privacy | Token Economy Leader

    31,967 followers

🚨🤖 PhD Saturday morning. Tokenisation Facing the Quantum Abyss: My Analysis of the HSBC Case

I’ve spent 20 years at the intersection of finance and tech, and if I’ve learned one thing, it’s that asset tokenisation (a projected $16 trillion opportunity) has an Achilles' heel: quantum computing. The "Store Now, Decrypt Later" threat is a ticking time bomb for long-lived assets like gold or bonds. I just dissected the whitepaper by HSBC and Quantinuum on their "Gold Token". Here is my executive summary and, more importantly, the technical "gaps" every CTO must consider.

🚀 The Win: Pragmatism over Perfection
Instead of a costly DLT re-engineering, they implemented a smart hybrid solution:
- PQC-VPN Overlay: They protected the transport layer (data in motion) with post-quantum cryptography without touching the ledger core.
- No Performance Impact: Most impressively, they kept latency and throughput (30-40 TPS) intact.
- Quantum Entropy: They hardened keys using QRNGs (quantum random number generators) to avoid algorithmic predictability.

⚠️ The 3 Critical Gaps (and how to bridge them):

1. Integrity vs. Confidentiality
The Flaw: The pilot secures the tunnel (VPN) and prioritizes confidentiality. However, it does not yet fully address the risk to digital signatures on the ledger itself; if a quantum actor breaks the signature scheme, they could forge transactions.
The Solution: "Phase 2" must integrate post-quantum signatures (like ML-DSA/Dilithium) directly at the DLT application level (see the signing sketch after this post).

2. The Interoperability Risk
The Flaw: Conversion to ERC-20 for interoperability is highlighted. But the moment the asset touches a non-quantum-resistant public network (like Ethereum today), it loses its immunity.
The Solution: Implement "Quantum Wrapped Tokens" that restrict holding to wallets with verified PQC security.

3. "Offline" Key Management
The Flaw: The entropy seed transfer was done "offline" (physically). This does not scale and represents a human operational risk.
The Solution: Automate seed rotation or, ideally, use Quantum Key Distribution (QKD) to eliminate the human factor.

My Verdict: HSBC has taken a vital first step to protect confidentiality today. But true quantum resistance requires protecting not just the "pipe" the data travels through, but the mathematical immutability of the asset itself.

Is your organization waiting for NIST, or are you already protecting the transport layer?

#FinTech #QuantumComputing #CyberSecurity #AssetTokenization #Blockchain #CISO #HSBC
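As a hedged illustration of what post-quantum signing at the application level could look like, here is a minimal ML-DSA (FIPS 204) sign/verify round trip, again assuming liboqs-python with ML-DSA-65 enabled; this is not HSBC's Phase 2 design, and the transaction payload is invented.

```python
import oqs

message = b"transfer 10oz tokenised gold to wallet 0xABC..."  # illustrative payload

# Signer generates an ML-DSA-65 keypair (FIPS 204) and signs the transaction.
with oqs.Signature("ML-DSA-65") as signer:
    public_key = signer.generate_keypair()
    signature = signer.sign(message)

# Any party can verify the signature with the public key alone, so a
# quantum-capable adversary cannot forge ledger entries this way.
with oqs.Signature("ML-DSA-65") as verifier:
    assert verifier.verify(message, signature, public_key)
```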

  • View profile for Colin S. Levy

    General Counsel at Malbek | Author of The Legal Tech Ecosystem | I Help Legal Teams and Tech Companies Navigate AI, Legal Tech, and Digital Enablement | Fastcase 50

    51,473 followers

As a lawyer who often dives deep into the world of data privacy, I want to delve into three critical aspects of data protection:

A) Data Privacy
This fundamental right has become increasingly crucial in our data-driven world. Key features include:
- Consent and transparency: Organizations must clearly communicate how they collect, use, and share personal data. This often involves detailed privacy policies and consent mechanisms.
- Data minimization: Companies should only collect data that's necessary for their stated purposes. This principle not only reduces risk but also simplifies compliance efforts.
- Rights of data subjects: Under regulations like GDPR, individuals have rights such as access, rectification, erasure, and data portability. Organizations need robust processes to handle these requests.
- Cross-border data transfers: With the invalidation of Privacy Shield and complexities around Standard Contractual Clauses, ensuring compliant data flows across borders requires careful legal navigation.

B) Data Processing Agreements (DPAs)
These contracts govern the relationship between data controllers and processors, ensuring regulatory compliance. They should include:
- Scope of processing: DPAs must clearly define the types of data being processed and the specific purposes for which processing is allowed.
- Subprocessor management: Controllers typically require the right to approve or object to any subprocessors, with processors obligated to flow down DPA requirements.
- Data breach protocols: DPAs should specify timeframes for breach notification (often 24-72 hours) and outline the required content of such notifications.
- Audit rights: Most DPAs now include provisions for audits and/or acceptance of third-party certifications like SOC 2 Type II or ISO 27001.

C) Data Security
These measures include:
- Technical measures: This could involve encryption (both at rest and in transit), multi-factor authentication, and regular penetration testing.
- Organizational measures: Beyond technical controls, this includes data protection impact assessments (DPIAs), appointing data protection officers where required, and maintaining records of processing activities.
- Incident response plans: These should detail roles and responsibilities, communication protocols, and steps for containment, eradication, and recovery.
- Regular assessments: This often involves annual security reviews, ongoing vulnerability scans, and updating security measures in response to evolving threats.

These aren't just compliance checkboxes – they're the foundation of trust in the digital economy. They're the guardians of our digital identities, enabling the data-driven services we rely on while safeguarding our fundamental rights. Remember, in an era where data is often called the "new oil," knowledge of these concepts is critical for any organization handling personal data.

#legaltech #innovation #law #business #learning

  • View profile for Peter Bordow

    Distinguished Engineer, Managing Director and PQC/Quantum Systems & Emerging Technologies R&D Leader for Cybersecurity at Wells Fargo

    6,194 followers

I'm excited to share this Case Study for Quantum Entropy Injection into HSMs for Post Quantum Cryptographic (PQC) Key Generation that our amazing PQC team and I recently completed.

In cybersecurity, entropy is the measure of randomness in a string of bits. In cryptography, entropy is used to produce random numbers, which in turn are used to produce cryptographic keys. As entropy increases, randomness gets better, keys become more difficult to determine, and security improves. Entropy is also important for the generation of random numbers and other critical security parameters such as seeds, salts, and initialization vectors for cryptographic algorithms.

Financial institutions must deal with the constant risk of cyber-attacks, underlining the responsibility to maintain and strengthen digital security for customers’ trust and integrity. A foundational step for addressing these issues is generating stronger cryptographic keys with better entropy (as part of a broader Defense in Depth PQC strategy). Using random bits (from quantum-sourced entropy) that are proven for improved randomness and unpredictability is pivotal for both today’s classical cryptography and tomorrow’s quantum-resistant cryptography.

Wells Fargo, Thales, and Quantinuum, working in collaboration, demonstrated the ability to generate strong cryptographic keys within the cryptographic boundary of a Thales Luna HSM, a FIPS 140-2 Level 3 cryptographic module, with external entropy. The keys were generated using random bits with verified quantum entropy acquired from the Quantinuum Origin trapped-ion quantum computer and validated using the Bell test to prove it met the threshold for quantum entropy. This cryptographic solution gives Wells Fargo a proven quantum entropy source to generate ultra-secure keys that can be designed and deployed at scale.
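To build intuition for entropy as "the measure of randomness in a string of bits", here is a small self-contained estimator (not from the case study). Note the caveat in the comments: a high Shannon estimate is necessary but not sufficient for cryptographic quality, which is why Bell-test-verified quantum entropy is the stronger claim.

```python
import math
import os
from collections import Counter


def shannon_entropy_per_byte(data: bytes) -> float:
    """Estimate Shannon entropy in bits per byte (the maximum is 8.0).

    Caveat: this only measures the byte-frequency distribution of a sample.
    A deterministic PRNG can score ~8.0 here, so a high estimate does not
    by itself prove cryptographic unpredictability.
    """
    counts = Counter(data)
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())


print(shannon_entropy_per_byte(b"AAAAAAAA"))            # 0.0: no randomness
print(shannon_entropy_per_byte(os.urandom(1_000_000)))  # ~7.9998: near maximum
```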

  • View profile for Animesh Gaitonde

    SDE-3/Tech Lead @ Amazon, Ex-Airbnb, Ex-Microsoft

    15,459 followers

Monzo Bank built Stand-In, a backup system that keeps banking services resilient for millions of customers. 🚀 🚀 Let's understand, in simple words, the architecture and trade-offs of the system. Monzo wants customers to continue banking despite disruptions in its software or cloud.

What were the goals of the system?
🎯 High availability: zero downtime for banking operations.
🎯 Cost effectiveness: minimize infra/backup costs.
🎯 Avoid a single point of failure on one cloud vendor.

Monzo operated its primary platform on AWS, and built Stand-In on Google Cloud.

What did the architecture of the system look like?
👉 Critical microservices: bank transfers, card payments, and balance checks.
👉 Message queues: for data transfer between AWS and GCP.
👉 Managed databases: syncing state from the primary platform.

During outages, customers were redirected to the Stand-In platform, minimizing customer disruption (a toy sketch of the redirect idea follows this post).

What trade-offs were made by the architecture?
1️⃣ Eventual consistency: used to ensure high availability, but it resulted in data-store inconsistencies.
2️⃣ Cost effectiveness: only a subset of critical services was deployed, unlike typical Disaster Recovery solutions. Stand-In added only 1% on top of the cost of the primary cluster in AWS.

What were some other challenges of Stand-In?
🌐 End-to-end customer testing.
🌐 Reconciliation between AWS and GCP due to inconsistencies.
🌐 Interoperability while dealing with multi-cloud architectures.

Monzo's Stand-In platform is a great example of how large companies build resilient distributed systems. 🔥 🔥 One of the key takeaways from the system design is prioritising critical features, minimizing cost, and providing high availability for customers.

Have you dealt with multi-cloud deployments in the past? If yes, share the challenges you faced in the comments below. 👇

#tech #softwareengineering #systemdesign
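A deliberately toy sketch of the redirect idea, with invented endpoints and health-check path; Monzo's real switchover logic is far more involved (gradual traffic shifting, reconciliation, human sign-off):

```python
import urllib.request

# Hypothetical base URLs for the primary (AWS) and stand-in (GCP) platforms.
PRIMARY = "https://api.primary.example.com"
STANDIN = "https://api.standin.example.com"


def healthy(base_url: str, timeout: float = 2.0) -> bool:
    """Probe an assumed /health endpoint; any failure counts as unhealthy."""
    try:
        with urllib.request.urlopen(f"{base_url}/health", timeout=timeout) as resp:
            return resp.status == 200
    except OSError:
        return False


def route_base_url() -> str:
    """Send traffic to the stand-in only when the primary fails its check."""
    return PRIMARY if healthy(PRIMARY) else STANDIN


print(route_base_url())
```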

  • View profile for Michelle Harvey

Independent ERP Consultant | Software Evaluation | Digital Transformation | Business and IT Systems Review | Project Management | Change Management

    11,596 followers

𝗔𝗿𝗲 𝘆𝗼𝘂 𝗦𝗮𝗳𝗲𝗴𝘂𝗮𝗿𝗱𝗶𝗻𝗴 𝗣𝗲𝗿𝘀𝗼𝗻𝗮𝗹 𝗗𝗮𝘁𝗮 𝗶𝗻 𝘆𝗼𝘂𝗿 𝗘𝗥𝗣 𝗣𝗿𝗼𝗷𝗲𝗰𝘁?

With the strong privacy laws in Australia, organizations must carefully manage personally identifiable information (PII) when converting data from legacy systems to new ERP, CRM, HR, and Payroll platforms.

𝗧𝗵𝗲 𝗗𝗮𝘁𝗮 𝗖𝗼𝗻𝘃𝗲𝗿𝘀𝗶𝗼𝗻 𝗖𝗵𝗮𝗹𝗹𝗲𝗻𝗴𝗲
During data conversion, information typically moves from older systems into staging areas before migration to the new environment. This process creates potential security vulnerabilities that must be planned for and addressed proactively.

𝗖𝗿𝗶𝘁𝗶𝗰𝗮𝗹 𝗣𝗜𝗜 𝗦𝗲𝗰𝘂𝗿𝗶𝘁𝘆 𝗤𝘂𝗲𝘀𝘁𝗶𝗼𝗻𝘀
Every digital transformation team should address these essential questions:
1️⃣ 𝗪𝗵𝗼 𝗵𝗮𝘀 𝗮𝗰𝗰𝗲𝘀𝘀 𝘁𝗼 𝘀𝗲𝗻𝘀𝗶𝘁𝗶𝘃𝗲 𝗱𝗮𝘁𝗮? Access should be strictly limited to necessary personnel.
2️⃣ 𝗔𝗿𝗲 𝘁𝗲𝗮𝗺 𝗺𝗲𝗺𝗯𝗲𝗿𝘀 𝗽𝗿𝗼𝗽𝗲𝗿𝗹𝘆 𝘁𝗿𝗮𝗶𝗻𝗲𝗱 𝗼𝗻 𝗣𝗜𝗜 𝗿𝗲𝗴𝘂𝗹𝗮𝘁𝗶𝗼𝗻𝘀? All staff handling sensitive data must understand their legal responsibilities and compliance requirements.
3️⃣ 𝗛𝗼𝘄 𝗶𝘀 𝗣𝗜𝗜 𝘀𝗲𝗰𝘂𝗿𝗶𝘁𝘆 𝗺𝗮𝗶𝗻𝘁𝗮𝗶𝗻𝗲𝗱 𝘁𝗵𝗿𝗼𝘂𝗴𝗵𝗼𝘂𝘁 𝘁𝗵𝗲 𝗽𝗿𝗼𝗰𝗲𝘀𝘀? Data must be transmitted and stored using encrypted methods at all times.
4️⃣ 𝗔𝗿𝗲 𝗽𝗿𝗼𝗽𝗲𝗿 𝗰𝗼𝗺𝗺𝘂𝗻𝗶𝗰𝗮𝘁𝗶𝗼𝗻 𝗽𝗿𝗼𝘁𝗼𝗰𝗼𝗹𝘀 𝗶𝗻 𝗽𝗹𝗮𝗰𝗲? Never send PII through unsecured channels like standard email.

𝗦𝗲𝗰𝘂𝗿𝗶𝗻𝗴 𝗣𝗜𝗜 𝗗𝘂𝗿𝗶𝗻𝗴 𝗗𝗶𝗴𝗶𝘁𝗮𝗹 𝗧𝗿𝗮𝗻𝘀𝗳𝗼𝗿𝗺𝗮𝘁𝗶𝗼𝗻
The key challenge is preventing unauthorized movement of sensitive data by:
❇️ Implementing strict access controls on the repository, ensuring no accidental inherited rights.
❇️ Disabling download capabilities where appropriate.
❇️ Enabling viewing or manipulation only by the Data Management team.
❇️ Establishing clear data handling protocols (e.g. no hard copies).

𝗥𝗲𝗰𝗼𝗺𝗺𝗲𝗻𝗱𝗲𝗱 𝗦𝗲𝗰𝘂𝗿𝗶𝘁𝘆 𝗦𝗼𝗹𝘂𝘁𝗶𝗼𝗻𝘀
Secure File Transfer Protocol (SFTP) should be used to move data at all times. SharePoint can be one of the most effective tools for protecting PII during digital transformation projects, offering finely controlled access and robust security features.
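For illustration, here is a minimal SFTP transfer using the open-source paramiko library; the host, service account, key path, and file paths are all hypothetical.

```python
import paramiko

# Hypothetical host and paths; key-based auth keeps passwords out of scripts.
HOST, PORT, USER = "sftp.example.com", 22, "migration_svc"
key = paramiko.Ed25519Key.from_private_key_file("/secure/keys/migration_ed25519")

transport = paramiko.Transport((HOST, PORT))
try:
    transport.connect(username=USER, pkey=key)
    sftp = paramiko.SFTPClient.from_transport(transport)
    # The extract is encrypted in transit end to end; no PII touches
    # email or unsecured shared drives.
    sftp.put("staging/employees_extract.csv", "/inbound/employees_extract.csv")
    sftp.close()
finally:
    transport.close()
```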

  • View profile for Nagaswetha Mudunuri

ISO 27001:2022 LA | AWS Community Builder | Building Secure digital environments as a Cloud Security Lead | Experienced in Microsoft 365 & Azure Security architecture | GRC

    9,479 followers

🔐 Data in Use: Protection Strategies

⚠️ The Challenge
When data is being processed in memory (RAM/CPU), it’s usually decrypted, which makes it vulnerable to:
💥 Insider threats
💥 Malware/memory scraping
💥 Cloud provider access

✅ Solutions for Data in Use

1. Homomorphic Encryption (HE)
Data stays encrypted even during computation. Supports analytics, AI/ML, and calculations without exposing raw values.
💥 Use case: A hospital can run statistics on encrypted patient data without seeing individual records.
Downside: Very slow for large-scale real-time workloads (still improving).

2. Secure Enclaves / Trusted Execution Environments (TEEs)
Hardware-based isolation → a secure “enclave” inside the CPU where data is decrypted and processed. Even the system admin or cloud provider cannot see inside.
✨ Examples:
💥 Intel SGX
💥 AMD SEV
💥 AWS Nitro Enclaves → lets you isolate EC2 instances for secure key management, medical data processing, payment transactions, etc.
💥 Use case: A bank can run fraud detection models on sensitive financial data in the cloud without exposing it to AWS staff.

3. Confidential Computing
Broader concept: combines TEEs, encrypted memory, and sometimes HE. Ensures that data remains protected throughout its lifecycle (rest, transit, use).
✨ Cloud examples:
💥 AWS Nitro Enclaves
💥 Azure Confidential Computing
💥 Google Confidential VMs

4. Secure Multi-Party Computation (MPC)
Multiple parties compute a function jointly without revealing their private inputs. Often used in cryptocurrency custody, federated learning, and zero-knowledge proofs (a toy secret-sharing sketch follows this post).
💥 Example: Banks collaboratively detect fraud patterns without sharing customer records.

#learnwithswetha #encryption #datainuse #learning #dataprotection #privacy
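To make item 4 concrete, here is a toy additive secret-sharing sketch in plain Python. It is illustrative only: production MPC uses hardened protocols, authenticated channels, and separate machines per party, and the fraud-loss figures are invented.

```python
import secrets

PRIME = 2**127 - 1  # field modulus; all arithmetic is done mod PRIME


def share(value: int, n_parties: int) -> list[int]:
    """Split a private value into n additive shares that sum to it mod PRIME."""
    shares = [secrets.randbelow(PRIME) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % PRIME)
    return shares


# Three banks each hold a private fraud-loss figure, never revealed directly.
private_inputs = [1_200, 3_400, 560]
all_shares = [share(v, 3) for v in private_inputs]

# Party j receives one share from each bank; each share looks uniformly random,
# so no single party learns anything about any individual input.
partial_sums = [sum(column) % PRIME for column in zip(*all_shares)]

# Combining the partial sums reveals only the aggregate, not the inputs.
total = sum(partial_sums) % PRIME
print(total)  # 5160: fraud detected collaboratively, records kept private
```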

  • View profile for Swarnali Singha

    Co-founder & CBO @ZERON | Cyber Risk Decision Intelligence | Agentic AI | Cyber Risk Quantification

    6,282 followers

The tension between maximizing data utility and upholding stringent privacy is a defining challenge. How can we leverage sensitive information for analytics, AI training, or collaborative research without ever exposing the raw data itself?

Enter Homomorphic Encryption (HE), a cryptographic approach that promises to solve this dilemma. Imagine performing computations directly on encrypted data, without any need for decryption. It's like giving someone a locked box, letting them process its contents, and getting a new locked box back, all without them ever seeing what's inside.

Where could this technology revolutionize data privacy?
✅ Cloud Computing: Securely outsourcing powerful analytics or privacy-preserving AI/ML model training to untrusted cloud environments, maintaining data confidentiality end-to-end.
✅ Healthcare & Genomics: Facilitating collaborative medical research across institutions on encrypted patient records or genomic data, accelerating breakthroughs without compromising individual privacy.
✅ Financial Services: Enabling fraud detection, risk assessments, or credit scoring by analyzing encrypted financial transactions, ensuring regulatory compliance and protecting sensitive customer portfolios.
✅ Government & Defense: Enabling secure intelligence sharing and processing of classified data in multi-party or untrusted environments.

However, the challenges are real:
🔴 Performance Overhead: Current HE schemes are computationally intensive. Operations on encrypted data are significantly slower and more resource-heavy than plaintext operations, making real-time applications a hurdle.
🔴 Complexity: Implementing and securely managing HE systems requires deep cryptographic expertise, posing a barrier for many organizations. The learning curve for developers is steep.
🔴 Data Expansion: Encrypted data often becomes significantly larger than its original plaintext, leading to increased storage and bandwidth requirements.
🔴 Limited Operations (Historically): While strides have been made, not all complex operations are equally efficient or even possible with current HE schemes. It's a highly specialized toolkit.
🔴 Bootstrapping: A key technique required to "refresh" noisy ciphertexts so that more complex computations remain possible, but it is one of the most computationally expensive steps.

Despite these hurdles, the progress in libraries like SEAL, HElib, and TFHE is truly remarkable. It promises a future where data utility and privacy can coexist (a minimal worked example follows this post).

What are your thoughts on Homomorphic Encryption's potential impact on cybersecurity and data privacy?

#DataSecurity #Encryption #HomomorphicEncryption #SecureData #DataPrivacy #CyberSecurity #SecureProcessing #CloudComputing #TechInnovation #DataProtection
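As a minimal worked example of the "locked box", here is ciphertext arithmetic with the open-source TenSEAL library, which wraps Microsoft SEAL's CKKS scheme; the parameters are common tutorial values, not a vetted security recommendation.

```python
import tenseal as ts

# Context with illustrative CKKS parameters (security/precision trade-off).
ctx = ts.context(
    ts.SCHEME_TYPE.CKKS,
    poly_modulus_degree=8192,
    coeff_mod_bit_sizes=[60, 40, 40, 60],
)
ctx.global_scale = 2**40

# Encrypt two vectors; everything below operates on ciphertexts only.
enc_a = ts.ckks_vector(ctx, [1.0, 2.0, 3.0])
enc_b = ts.ckks_vector(ctx, [10.0, 20.0, 30.0])

# Computed without ever decrypting: the "locked box" stays locked.
enc_result = enc_a + enc_b * 2

print(enc_result.decrypt())  # ≈ [21.0, 42.0, 63.0] (CKKS is approximate)
```

The approximate results and the size of the encryption context illustrate two of the challenges above: limited precision and data expansion.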

  • View profile for Antonio Grasso

    Technologist & Global B2B Influencer | Founder & CEO | LinkedIn Top Voice | Driven by Human-Centricity

    42,138 followers

In an era where data sharing is both essential and a source of concern, six fundamental techniques are emerging to protect privacy while enabling valuable insights.

Fully Homomorphic Encryption encrypts data before it is shared, allowing analysis without decoding the original information and thus safeguarding sensitive details.

Differential Privacy adds calibrated noise to a dataset, making it practically impossible to recover the original inputs while still permitting generalized analysis (see the sketch after this post).

Functional Encryption gives selected users a key that reveals only specific parts of the encrypted text, offering relevant insights while withholding other details.

Federated Analysis allows parties to share only the insights from their analysis, not the data itself, promoting collaboration without direct exposure.

Zero-Knowledge Proofs enable users to prove their knowledge of a value without revealing it, supporting secure verification without unnecessary exposure.

Secure Multi-Party Computation distributes data analysis across multiple parties, so no single entity can see the complete set of inputs, ensuring a collaborative yet compartmentalized approach.

Together, these techniques pave the way for a more responsible and secure future for data management and analytics.

#privacy #dataprotection
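To make Differential Privacy concrete, here is a toy Laplace-mechanism sketch in Python; epsilon, the clamping bounds, and the example data are illustrative choices, not recommendations.

```python
import random


def laplace(scale: float) -> float:
    # The difference of two iid exponentials is Laplace(0, scale)-distributed.
    return random.expovariate(1 / scale) - random.expovariate(1 / scale)


def dp_sum(values: list[float], epsilon: float, lower: float, upper: float) -> float:
    """Release a sum with epsilon-differential privacy.

    Each value is clamped to [lower, upper], so one individual can change
    the true sum by at most (upper - lower): that bound is the sensitivity,
    and the Laplace noise is scaled to sensitivity / epsilon.
    """
    clamped = [min(max(v, lower), upper) for v in values]
    sensitivity = upper - lower
    return sum(clamped) + laplace(sensitivity / epsilon)


ages = [34, 29, 41, 52, 38]  # illustrative data
print(dp_sum(ages, epsilon=0.5, lower=0, upper=100))  # noisy but useful aggregate
```

Smaller epsilon means stronger privacy and noisier answers; the aggregate stays useful while no single record can be confidently inferred from the output.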
