
    What’s Missing From DLP


    Published by Gbaf News

    Posted on February 25, 2012


    By David Gibson, director of strategy at Varonis www.varonis.com

    In most organizations today, sensitive data is overexposed and vulnerable to misuse or theft, leaving IT in an ongoing race to prevent data loss. Packet sniffers, firewalls, virus scanners, and spam filters do a good job of securing the borders, but what about insider threats? The threat of legitimate, authorized users unwittingly (or wittingly) leaking critical data simply by accessing what is available to them is all too real. Analyst firms such as IDC estimate that unstructured data, which makes up 80% of organizational data, will grow by 650% over the next five years. The risk of data loss is increasing even faster than this explosive rate, as more dynamic, cross-functional teams collaborate and data is continually transferred between network shares, email accounts, SharePoint sites, mobile devices, and other platforms. As a result, security professionals are turning to data loss prevention (DLP) solutions for help. Unfortunately, organizations are finding that these DLP solutions in many cases fail to fully protect critical data, because they focus on symptomatic, perimeter-level fixes to a much deeper problem: the fact that users have inappropriate or excessive rights to sensitive information.

    DLP Alone is Not a Panacea

    DLP solutions primarily focus on classifying sensitive data and preventing its transfer with a three-pronged technology approach:

    • Endpoint protections encrypt data on hard drives and disable external storage to stop data from escaping via employee laptops and workstations.
    • Network protections scan and filter sensitive data to prevent it from leaving the organization via email, HTTP, FTP and other protocols.
    • Server protections focus on content classification and identifying sensitive files that need to be protected before they have a chance to escape (a minimal classification sketch follows this list).
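
    To make the server-side prong concrete, the sketch below shows a deliberately simple, pattern-based content classifier that walks a file share and flags files matching a few common PII patterns. It is a minimal illustration only, not Varonis’s or any vendor’s actual engine; the patterns, paths, and thresholds are assumptions chosen for readability.

```python
# Minimal pattern-based content classification sketch (illustrative only).
# Real DLP engines use far richer detection (fingerprints, dictionaries, ML).
import os
import re

# Hypothetical patterns for common PII; real rule sets are much broader.
PATTERNS = {
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
    "us_ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

def classify_file(path):
    """Return the set of pattern names found in a single text file."""
    hits = set()
    try:
        with open(path, "r", errors="ignore") as handle:
            text = handle.read()
    except OSError:
        return hits  # unreadable files are simply skipped in this sketch
    for name, pattern in PATTERNS.items():
        if pattern.search(text):
            hits.add(name)
    return hits

def scan_share(root):
    """Walk a file share and yield (path, hits) for files that look sensitive."""
    for dirpath, _dirs, files in os.walk(root):
        for filename in files:
            path = os.path.join(dirpath, filename)
            hits = classify_file(path)
            if hits:
                yield path, hits

if __name__ == "__main__":
    # "/shares/finance" is a placeholder path, not a real deployment setting.
    for path, hits in scan_share("/shares/finance"):
        print(f"ALERT: {path} matched {sorted(hits)}")
```

    Even this toy scan makes the article’s point: it can surface thousands of “sensitive” files quickly, but it says nothing about who owns them, who can reach them, or who actually uses them.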

    This approach works well if an organization knows who owns all the sensitive data and who is using it. Since that is almost never the case, once the sensitive data is identified (which in an average-sized organization can take months), IT is left with the monumental job of finding out: To whom does the sensitive data belong? Who has, and who should have, access to it? Who is using it? These questions must be answered in order to identify the highest-priority sensitive data (i.e., the data in use) and to determine the appropriate data loss prevention procedures.

    Early solutions that focused primarily on endpoint and network protections were quickly overwhelmed by the massive amounts of data traversing countless networks and devices. Unfortunately, DLP’s file-based approach to content classification is cumbersome at best. Upon implementing DLP, it is not uncommon to have tens of thousands of “alerts” about sensitive files. The challenge doesn’t stop there. Select an alert at random: the sensitive files involved may have been auto-encrypted and auto-quarantined, but what comes next? Who has the knowledge and authority to decide the appropriate access controls? Who are we now preventing from doing their jobs? How and why were the files placed here in the first place?

    DLP solutions provide very little context about data usage, permissions, and ownership, making it difficult for IT to proceed with sustainable remediation. IT does not have the information it needs to make decisions about accessibility and acceptable use on its own; and even if that information were available, it is not realistic to make these kinds of decisions for each and every file.

    The reality is that sensitive files are being used to achieve important business objectives; digital collaboration is essential for organizations to function successfully. To make that possible, sensitive data must be stored somewhere that allows people to collaborate on it, while ensuring that only the right people have access and that their use of sensitive data is monitored.

    Context Is King
    When an incident occurs or an access control issue is detected, organizations shouldn’t be required to turn their business into a panic room. Rather, solutions to prevent data loss need to enable the personnel with the most knowledge about the data, the data owners, to remediate risks quickly and in the right order. To do this, organizations need enterprise context awareness: knowledge of who owns the data, who uses the data, and who should and shouldn’t have access.

    Managing and protecting sensitive information requires an ongoing, repeatable process. The analyst firm Forrester refers to this as protecting information consistently with identity context (PICWIC).

    The central idea of PICWIC is that data is assigned to business owners at all times. When identity context is combined with data management, organizations can provision new user accounts with the correct levels of access, recertify access entitlements regularly, and take the appropriate actions when an employee changes roles or is terminated. Following PICWIC best practices dramatically reduces the chances of accidental data leakage while lifting a substantial burden from IT.
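
    As a hedged illustration of what regular recertification might look like in practice, the sketch below builds a per-owner review list by comparing who is entitled to a data set against who has actually used it recently. The data structures, sample values, and the 90-day threshold are assumptions for the example, not a description of any particular product.

```python
# Entitlement recertification sketch (illustrative, not a product feature).
from datetime import datetime, timedelta

# Assumed inputs: an owner per data set, current entitlements, and access logs.
DATASETS = {
    "/shares/finance/payroll": {
        "owner": "payroll.manager@example.com",          # hypothetical owner
        "entitled": {"alice", "bob", "carol", "dave"},
        "last_access": {"alice": "2025-05-01", "bob": "2024-09-12"},
    },
}

STALE_AFTER = timedelta(days=90)  # review threshold; chosen for the example

def build_review(datasets, today=None):
    """For each data set, list entitled users with no recent activity so the
    owner can confirm or revoke their access."""
    today = today or datetime.now()
    reviews = {}
    for path, info in datasets.items():
        stale = []
        for user in sorted(info["entitled"]):
            last = info["last_access"].get(user)
            if last is None or today - datetime.fromisoformat(last) > STALE_AFTER:
                stale.append(user)
        if stale:
            reviews.setdefault(info["owner"], []).append((path, stale))
    return reviews

if __name__ == "__main__":
    for owner, items in build_review(DATASETS).items():
        print(f"Recertification request for {owner}:")
        for path, users in items:
            print(f"  {path}: confirm or revoke access for {', '.join(users)}")
```

    The point is the workflow rather than the code: decisions go to the data owner, and the review repeats on a schedule instead of happening as a one-off cleanup.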

    Advanced and Comprehensive DLP
    The concept of PICWIC, and the policies and procedures it enables, is very promising, but how can organizations implement PICWIC and improve their DLP deployments? The key to providing the necessary context lies in metadata: collecting and analyzing the required metadata non-intrusively, automating workflows and report generation, and following a reliable operational plan. With recent advancements in metadata technology, data governance software is giving organizations the ability to improve DLP implementations by not only automating the process of identifying sensitive data, but also simultaneously showing what data is in use and who is using it; in other words, providing the needed context for comprehensive DLP. By continuously and non-intrusively collecting critical metadata such as permissions, user and group activity, access events, and content sensitivity, and then synthesizing this information, data governance software provides visibility never before available with traditional DLP implementations. When data governance software is used in conjunction with traditional DLP software, implementations move faster and sensitive data is more accurately identified and protected.
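
    A minimal sketch of that synthesis step is shown below: it joins three assumed metadata feeds (content sensitivity, effective permissions, and recent access activity) and ranks folders so that sensitive data that is both broadly exposed and actively used rises to the top of the remediation queue. The field names, sample values, and scoring weights are invented for the example.

```python
# Metadata synthesis sketch: combine sensitivity, exposure, and activity
# to prioritize remediation (illustrative only; names and weights assumed).

# Assumed feeds, keyed by folder path.
SENSITIVITY = {"/shares/finance/payroll": 3, "/shares/marketing/assets": 1}  # 0-3 scale
PERMISSIONS = {  # number of users with effective access (e.g. via groups)
    "/shares/finance/payroll": 240,
    "/shares/marketing/assets": 35,
}
ACTIVE_USERS_30D = {"/shares/finance/payroll": 12, "/shares/marketing/assets": 20}

def risk_score(path):
    """Higher score = sensitive, widely accessible, and in active use."""
    sensitivity = SENSITIVITY.get(path, 0)
    exposure = PERMISSIONS.get(path, 0)
    activity = ACTIVE_USERS_30D.get(path, 0)
    # Weights are arbitrary for the sketch; real tools model this differently.
    return sensitivity * 10 + min(exposure, 100) * 0.5 + activity

def prioritized(paths):
    """Sort folders so the riskiest combination comes first."""
    return sorted(paths, key=risk_score, reverse=True)

if __name__ == "__main__":
    for path in prioritized(SENSITIVITY):
        print(f"{risk_score(path):6.1f}  {path}")
```

    Note how much of the ranking comes from permissions and activity rather than content alone; that is the context a classification-only DLP deployment lacks.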

    With over 23 million records containing personally identifiable information (PII) leaked in 2011 alone (source: privacyrights.org), it is more important than ever for organizations to ensure sensitive data is secure. Regulatory developments such as the European Union’s recent decision to fine businesses that breach its privacy rules up to two percent of their global turnover make it imperative for organizations to ensure their DLP practices are quick, comprehensive and continuous.

    Integrating data governance software automation into existing or new DLP implementations not only ensures sensitive data is secure, but also provides a speed and scale that traditional DLP cannot achieve. Because data governance software automatically adjusts as file structures and activity profiles change, access controls to shared data are always current and based on business needs. As a result, the fundamental step of data loss prevention is addressed: limiting what data makes its way to laptops, printers and USB drives in the first place. That way, efforts to further protect data via filtering, encryption and the like can be focused more efficiently on only those items that are valuable, sensitive and actively being accessed.
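
    To illustrate the kind of automatic adjustment described above, the sketch below compares a folder’s effective access list with its observed users over a recent window and proposes a tighter group when the gap is large. The inputs and the “ten times more entitled than active” rule are assumptions; a real data governance product would base such recommendations on much richer modeling and route them through data owners before enforcement.

```python
# Access-tightening sketch: flag folders whose effective access far exceeds
# observed use, and propose a narrower access list (illustrative only).

# Assumed inputs for one folder.
EFFECTIVE_ACCESS = {f"user{i}" for i in range(1, 241)}        # via a broad group
OBSERVED_USERS_90D = {"user3", "user17", "user42", "user98"}  # from access logs

def tighten(effective, observed, ratio=10):
    """If far more people can reach the data than actually use it, recommend
    restricting access to the observed users (pending owner approval)."""
    if len(observed) == 0:
        return None  # nobody uses it; archiving is a separate decision
    if len(effective) >= ratio * len(observed):
        return {
            "action": "replace broad group with scoped group",
            "keep": sorted(observed),
            "remove_count": len(effective - observed),
        }
    return None

if __name__ == "__main__":
    recommendation = tighten(EFFECTIVE_ACCESS, OBSERVED_USERS_90D)
    if recommendation:
        print("Recommend:", recommendation["action"])
        print("Retain access:", ", ".join(recommendation["keep"]))
        print("Entitlements to remove:", recommendation["remove_count"])
```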

    About Varonis:
    Varonis is the leader in unstructured and semi-structured data governance for file systems, SharePoint and NAS devices, and Exchange servers. The company was named “Cool Vendor” in Risk Management and Compliance by Gartner, and voted one of the “Fast 50 Reader Favorites” on FastCompany.com. Varonis has over 3,000 installations worldwide. Based on patented technology and a highly accurate analytics engine, Varonis’ solutions give organisations total visibility and control over their data, ensuring that only the right users have access to the right data at all times, all use is monitored, and potential abuse is flagged. Varonis is headquartered in New York, with regional offices in Europe, Asia and Latin America, and research and development offices in Hertzliya, Israel.

    Varonis, the Varonis logo, DatAdvantage and DataPrivilege are registered trademarks of Varonis Systems in the United States and/or other countries, and Data Classification Framework and Metadata Framework are under a registration process in the United States and/or other countries. All other product and company names and marks mentioned in this document are the property of their respective owners and are mentioned for identification purposes only. www.varonis.com

    Press contact:
    Regine Hartmann
    Eskenzi PR Ltd.
    Tel: +44 20 7183 2834
    Email: regine@eskenzipr.com
    www.eskenzipr.com
