Monday, May 15, 2017

Data Protection Coexisting with Consumer Privacy


Data privacy is a prevailing concern.  This piece is the first in a series addressing what clients wrestle with in the delivery and administration of technology for their own clients and in their day-to-day operations.  Technology will continue to influence our daily lives, and with its benefits come challenges.  The benefits bring greater freedom and efficiency for the user and consumer.  The challenges fall on both the consumer and the operator administering the technological innovation.  The questions gnaw: Can the data be kept and transmitted safely? Can the data be kept private as required?  Amid the many nuanced articles on the subject, these questions should be the gravamen: can data be kept safe, and can it be kept private, as it is handled daily in the delivery of services?

This is also about trust in the development and reliability of systems, networks, programs, hardware, and the personnel administering them all, and trust that the combination of technological solutions and integration will deliver the needed safety and, ultimately, privacy.  That trust has value: value to the individual expecting privacy, and value to the entity responsible for maintaining safety and privacy as custodian of the data, especially as the data grows daily with users' use of services.

The element of trust in the innovation and administration of technology to keep data safe and private grows in importance as society's reliance on technology grows.  To address that growing reliance, there needs to be a basis, a standard, or a measure by which regulators and the courts can decide when it was not met.  That matters, of course, when litigation hits.  But it also points to a goal for how data protection will coexist with consumer privacy, a coexistence that is expected even if it goes unsaid.

The counsel given to entities in this realm, whether purveyors of technology, administrators of cloud systems, data aggregators, network designers, or source code writers, is that privacy is the end game, buttressed by accountability.  The end game of privacy lies in the data maintained, transmitted, accessed, and delivered to the correctly identified and intended recipient.  Accountability is the essential ingredient in the pursuit of new programs and approaches that innovate toward the coexistence of data protection and consumer privacy, and it is what sustains consumer trust and confidence.  This accountability should rest on checks and balances in the development, application, and delivery of daily data services.  It must also be an organizational task, instilled in management and carried throughout the organization down the service line.  As clients address this issue, they realize it is their decision-making process that needs attention as they focus on the privacy of the data they administer or of the program they design.  Their approach needs to surpass a short-sighted view of regulatory compliance.

Experience demonstrates that as clients address their decision-making regarding privacy, data protection, consumer expectations and trust, and their own accountability, they realize that their culture changes.  Policies then result from that accountable way of thinking, one that emphasizes privacy.  Privacy programs within organizations require levels of cooperation.  The internal side of the change should drive the entity to instill data governance, risk assessment, and needs evaluation.  The approach should be one of data governance that syncs legal, operations, contract management, information technology, and human resources, to name a few, so that all levels see eye to eye on data protection and its relationship to consumer privacy and the consumer's trust.
In application, the relationship of data protection to data privacy is usually seen as the technological and administrative means by which digital information is used, shared, described, and analyzed.  That focal point is more than administering who has access and whether anyone else can decipher the data in transit across networks.  The challenges to the necessary coexistence of data protection and consumer privacy are not matters for academic theorizing or esoteric concepts.  They are hard, and they force honest consideration of the applications and efforts needed to achieve the privacy and protection desired and expected.  Other issues will be discussed in the next piece.

Originally published www.lorenzolawfirm.com March 8, 2017
Lorenzo Law Firm is “Working to Protect your Business, Ideas, and Property on the Web." Copyright 2017, all rights reserved Lorenzo Law Firm, P.A.

Thursday, May 11, 2017

Internet Tech Terms in Use


Internet tech terms are commonly mentioned in a law firm's discussions when the practice focuses on issues in the market engaging the Internet and, among other aspects, computing, e-commerce, data protection, software development, application development, data management infrastructure, cloud processing, and cyber security, to mention a few.  As these terms arise in conversations, contracts reviewed or drafted, negotiations, legal advisory opinions, and litigation, there is always a need to flesh out their meanings in the situation or business endeavor.  It is also frequently necessary to describe their use and how they interplay in the aggregate of the whole system, for lack of a better word.  The word Internet remains the prevailing term to describe, as parsimoniously as possible, the cyber realm in which all of this is transmitted.  The following terms are thrown around in articles galore, assuming the readers' common awareness.

Our experience has been that the opposite is true. Many times a client did not understand how their technology was stacked and was not aware of the difference between the client side and the server side of their system.  The data gathered by clients becomes overwhelming, and they are not aware of the tools available to cull, read, and manage it productively.  They enquire about the cloud and how cloud computing could be an operative means for their business, without appreciating the risks and the methods that can be applied.  So, for purposes of enhancing a general understanding going forward, this piece will touch briefly on frequent terms that will be helpful to businesses, entrepreneurs, startups, collaborators, and more.  The terms are addressed in alphabetical order, not in rank of prevalence or importance.
Algorithm - Traditionally one thinks of math and formulas.  What is actually being employed is a set of rules that sets a procedure by which data is attended to in order to execute a process.  What that process is has no limits.  An algorithm can be used to cull data by designated criteria, i.e., metadata.  It can be used for computer forensics involving the investigation of characters, words, images, phrases, and themes. An algorithm can be a set of rules that encrypts messages so that they are unreadable unless one has the key to decode them. Calculations are possible for astronauts, missile delivery systems, the orientation of satellites for defense or communications, or even for being able to watch an MLB game.
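
For illustration, here is a minimal sketch in Python of two such rule sets: one that culls records by a criterion, and one toy cipher whose output is readable only with the key.  The record fields and the shift value are hypothetical examples, not anyone's actual system.

# Two everyday algorithms written as plain rules (illustrative only).

def filter_records(records, min_year):
    """Rule set 1: keep only records created on or after min_year."""
    return [r for r in records if r["year"] >= min_year]

def caesar_encrypt(message, shift=3):
    """Rule set 2: a toy cipher; unreadable without knowing the shift (the 'key')."""
    return "".join(chr((ord(c) - 97 + shift) % 26 + 97) if c.isalpha() else c
                   for c in message.lower())

records = [{"name": "a.txt", "year": 2015}, {"name": "b.txt", "year": 2017}]
print(filter_records(records, 2016))   # [{'name': 'b.txt', 'year': 2017}]
print(caesar_encrypt("meet at noon"))  # 'phhw dw qrrq'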

Application Programming Interface - A simple way of describing how an API is discussed with clients, when working on contracts and terms of use agreements for their clients' projects, is to address the API as a form and means of communication among the software being used.  The communication between software in and through devices may be via data patterns, certain criteria or variables, or queued calls.  Depending on the contract and the work to be done by the client for their client, a computer program may be developed that is itself based on building blocks devised by the programmer using an API.  An API may serve multiple purposes.  An API can manage a library of software so it works together, manage a database or an operating system, and it can function via the web.  The important aspect is understanding that there are instructions on how data is exchanged from software to software and how that data result is made usable.
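
A hedged sketch of that exchange, in Python: one program asks another for data and receives a usable, structured result.  The endpoint URL, parameters, and response fields below are hypothetical; a real service publishes its own instructions for this communication.

# Calling a (hypothetical) web API from Python.
import requests

def look_up_order(order_id):
    """Ask another piece of software for data and make the result usable."""
    response = requests.get(
        "https://api.example.com/v1/orders",   # hypothetical endpoint
        params={"id": order_id},               # criteria sent to the other program
        timeout=10,
    )
    response.raise_for_status()                # surface any error the API reports
    return response.json()                     # structured data the caller can use

# order = look_up_order("12345")
# print(order["status"])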

Artificial Intelligence - People normally think of a robot when AI is mentioned.  But it is not the physical appearance of an operation that should trigger attention.  Attention is warranted on the hidden aspect, which is the process that drives AI, i.e., rules, algorithms, software, etc.  The rules utilized together create a process in which questions can be answered, devices are made to begin operation, or even medical diagnoses can be drawn.  AI can learn from itself as results are noted and assessed for continuous improvement.  AI is not all about robots; it could just as well regard bots.  Bots can be termed the soft side of AI.  One can say that AI is a composition of processes driven by knowledge that can be continuously learned beyond what was initially programmed. The challenge for startups, established businesses, and even government institutions is how to learn to use AI, and to use it ethically.

Many uses are before us, and many people do not acknowledge AI's existence.  Consider the following examples of AI in our interactions: electronic financial transactions; job searches and queries and the matching results; journalism using storytelling agents, such as the Washington Post's Heliograf; Pinterest enhancing the recognition of images and improving searches; and investment fund indexing, to mention a few. The key to keep in mind is that AI can learn from itself.
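
A toy Python illustration of that last point, the process improving from its own results: a matching routine starts with programmed rules and then re-weights them from observed outcomes.  The search terms and feedback values are made up; this is a sketch of the idea, not a production AI system.

from collections import defaultdict

class MatchingBot:
    """Starts with programmed rules, then re-weights them from observed outcomes."""
    def __init__(self):
        self.weights = defaultdict(lambda: 1.0)   # initial, programmed knowledge

    def rank(self, candidates, query_terms):
        def score(candidate):
            return sum(self.weights[t] for t in query_terms if t in candidate)
        return sorted(candidates, key=score, reverse=True)

    def learn(self, term, was_helpful):
        """Feedback loop: helpful matches strengthen the rule that produced them."""
        self.weights[term] *= 1.1 if was_helpful else 0.9

bot = MatchingBot()
print(bot.rank(["data analyst", "sales lead"], ["data"]))
bot.learn("data", was_helpful=True)   # the next ranking reflects what was learned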

Big Data - The term is used to describe data accumulated from multiple sources and used for multiple purposes.  The former conveys the immense obligation to manage what is received; the latter conveys the ethics and means of how to use the data.  The data can be telling of many things, such as customer choices, consumer trends, investment trends, allocation of taxes, misallocation of revenues, frequency of transactions, trends in transactions, inventory adjustments, supply chain trends among vendors, employee computer use, employee attendance and productivity, and national grading trends; the list of uses seems endless.  What also makes it "big" is the multiplicity of sources.  An individual's everyday life creates data.  Data is created by mobile use, web searches, buying gasoline or coffee, checking out a book at the local library, social media communications, forum memberships, and newsletter interactions; again, the list of sources continues.

The immensity of this data gave rise to Data as a Service, or "DaaS," where the administrative handling of the data is outsourced due to cost constraints.  Big data is assessed by its velocity (accumulation), volume (amount), value (usefulness), and variety (interrelated and not stratified).  The volume comes from captured data, which increases with web traffic and network processes.  The variety is by its very nature diverse.  Its value depends on relevance to the entity's function and mission.  The velocity is a consequence of the integrated recording of activities.  Clients receiving the data run criteria for the company that uses the data, whether for sales, product improvements, diagnostics, predictions, market assessments, or inventory allocations.
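
A minimal Python sketch of the "variety" and "value" aspects: records arriving from several hypothetical sources are merged and reduced to a simple, usable trend.  The sources and field names are invented for illustration.

from collections import Counter

web_logs   = [{"user": "u1", "action": "search"}, {"user": "u2", "action": "buy"}]
pos_sales  = [{"user": "u2", "action": "buy"},    {"user": "u3", "action": "buy"}]
app_events = [{"user": "u1", "action": "search"}]

combined = web_logs + pos_sales + app_events          # variety: multiple sources
trend = Counter(record["action"] for record in combined)
print(trend)   # Counter({'buy': 3, 'search': 2}): a simple consumer trend (value)
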
Blockchain - Transactions and activity exchanges need to be recorded in order to be verified, measured, and accounted for.  What blockchain does is record transactions between participants: client and business, members of a group, and so on.  It can be programmed to set transactions, which can be used for automatic payments that are then reconciled in near real time.  Royalties are compensated as intellectual property is used.  Such uses can also help in the development of business processes to enhance efficiencies.  By serving as a ledger, transactions are confirmed, which can have multiple benefits, from adjusting inventories across the globe to tracking monetary flows of investments.  There is also the security aspect of the blocks in general.  These blocks of data cannot be altered and serve as a resource of verification with a time stamp.  Blockchain technology also has a link between blocks that confirms verification.  It is essentially software for integrated transaction recording.  Blockchain's use is beneficial for records management, the medical field, banking, investments, money transfers, identity screening for policing, processing transactions (ATMs, EFTs, ACH), and internal revenue tracking.
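
A simplified Python sketch of the ledger idea: each block carries a time stamp and the hash of the previous block, so altering any earlier record breaks the chain.  The participants and amounts are hypothetical, and a real blockchain adds consensus and distribution on top of this.

import hashlib, json, time

def make_block(transactions, previous_hash):
    """Record a batch of transactions and link it to the prior block."""
    block = {
        "timestamp": time.time(),
        "transactions": transactions,
        "previous_hash": previous_hash,
    }
    block["hash"] = hashlib.sha256(
        json.dumps(block, sort_keys=True).encode()
    ).hexdigest()
    return block

genesis = make_block([{"from": "A", "to": "B", "amount": 10}], previous_hash="0")
second  = make_block([{"from": "B", "to": "C", "amount": 4}], genesis["hash"])
print(second["previous_hash"] == genesis["hash"])   # True: the link that verifies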


Cloud Computing (CC) - At its core, CC is remote access to records, emails, and data that the entity does not store digitally on its own servers.  The computing "through the cloud," so to speak, happens by accessing the information from a distance.  Particular software enables that feature and provides the service via the web.  The cloud computing function uses software as a service (SaaS), where the client entity logs in on their account using the rented software to access their records, emails, and so on.  The idea is that data, records, and documents are stored in a central or distributed place and not at the business or entity's place of operation.

Clients then shift from owning infrastructure to renting and sharing resources, which inevitably shares the administrative aspects of storage.  By allowing remote access from anywhere with a web connection, the resources are shared, that is, servers, storage, applications (APIs), algorithms, and hardware.  The purpose is to minimize costs based on usage, on a model similar to the delivery of telecommunication systems, where the network is shared.
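
A hedged Python sketch of that "rent and access remotely" model: the documents live with a hosted provider, and the client reaches them over the web on demand.  The service URL, token, and document ID below are hypothetical, not any particular vendor's API.

import requests

API_BASE = "https://storage.example-cloud.com/v1"    # hypothetical SaaS provider
TOKEN = "replace-with-account-credential"            # placeholder, not a real key

def fetch_document(doc_id):
    """Retrieve a record stored off-site instead of on the entity's own servers."""
    response = requests.get(
        f"{API_BASE}/documents/{doc_id}",
        headers={"Authorization": f"Bearer {TOKEN}"},
        timeout=10,
    )
    response.raise_for_status()
    return response.content     # pay-per-use access; no local infrastructure owned

# contract_pdf = fetch_document("contract-2017-001")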

Interface - On many occasions when discussions regard a service issue, the interface comes up in determining where there could have been a quality issue.  A user experiences a home page that displays, in a formatted way, the program to operate.  The application is then interfaced for the user to access the program.  What we tell clients is that it is the bridge between the user and the program.  So the user interface is about the instructions used to access and operate the application.  When the client issue regards multiple applications or devices, we then discuss the interface that facilitates the communication of coded instructions for the operating system and/or the devices.  It is essentially the means of integration.
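
A minimal Python sketch of an interface as that bridge: an agreed set of instructions that any storage component must honor, so the rest of the program need not care which device or system sits behind it.  The class names are illustrative.

from abc import ABC, abstractmethod

class StorageInterface(ABC):
    """The agreed instructions between the application and a storage component."""
    @abstractmethod
    def save(self, name: str, data: bytes) -> None: ...
    @abstractmethod
    def load(self, name: str) -> bytes: ...

class InMemoryStorage(StorageInterface):
    """One possible component behind the bridge; others could be swapped in."""
    def __init__(self):
        self._items = {}
    def save(self, name, data):
        self._items[name] = data
    def load(self, name):
        return self._items[name]

store = InMemoryStorage()
store.save("note.txt", b"hello")
print(store.load("note.txt"))   # the caller only ever talks to the interface
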
Internet of Things - Clients are seeking to streamline their processes and enquire about the liabilities of IoT, but more so, they wonder about the process.  What takes place is a convergence of systems by use of the Internet.  The technology that administers data interfaces with the operational technology, whereby data is transmitted either to operate the device or to record data driven by the device or by its use.  The IoT functions by recognizing the Internet Protocol address of the device, which then transmits the data through the network while acknowledging the source of the data.  This is typical of heart monitors issuing a reading, alarm system controls receiving a call for a monitoring update while the owner is away from home, a wearable fitness watch transmitting distance on a run, or cyber security detection of an intrusion attempt.  The IoT involves constant connection, transmission, and collection of data about uses, functions, and performance.  Where a client may be concerned with measuring a certain process, e.g., the rate of fermentation of a new beer recipe, sensors can transmit the data needed to measure it, as in the sketch below.
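
A hedged Python sketch of that fermentation example: a sensor reading is packaged with the device's identity and sent over the network for monitoring.  The gateway URL, device ID, and payload fields are hypothetical.

import json, time, urllib.request

def send_reading(device_id, gravity_reading):
    """Transmit one sensor reading, identifying the device as the source."""
    payload = json.dumps({
        "device_id": device_id,              # identifies the source of the data
        "specific_gravity": gravity_reading, # the measurement being collected
        "timestamp": time.time(),
    }).encode()
    request = urllib.request.Request(
        "https://iot-gateway.example.com/readings",   # hypothetical collector
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request, timeout=10) as response:
        return response.status   # part of the constant connect-transmit-collect loop

# send_reading("fermenter-07", 1.012)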

Malware –  Intrusions increasingly use malware, a type of software coded to find vulnerabilities in a target system or network.  Gaining access is the initial goal of the malware, which heightens the importance of training personnel to recognize suspicious emails and links.  There is a variety of malware types, commonly known as spyware, which monitors and steals information; keyloggers, which trace character strokes on the keyboard; viruses, which halt computing operations; worms, which seek information inherent in the system; and ransomware, which seeks to extort money by seizing the computer operation.  The carrier of the malware could be deceptively embedded in advertising (adware).  Some malware attacks the operating system of the computer or network, making the victim take certain predictable or forced steps that open the victim up to more harm.  The contentious issue for clients is whether they had proper cyber liability insurance that would cover the cyber incident.
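
On the defensive side only, a minimal Python sketch of one common control: hashing an incoming file and comparing it against a list of known-bad signatures before it is opened.  The signature value and the quarantine step are made-up placeholders.

import hashlib

KNOWN_BAD_HASHES = {
    "d2f0placeholder0000",   # hypothetical published malware signature
}

def looks_malicious(path):
    """Hash the file in chunks and check it against the known-bad list."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest() in KNOWN_BAD_HASHES

# if looks_malicious("suspicious_attachment.exe"):
#     quarantine_and_alert()   # hypothetical incident-response step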

Machine Learning – Our startup clients handling complex projects as a service, e.g., modeling, programming, and email filtering, utilize statistical data in a variety of ways.  The projects involve computational tasks that generate productive information for the intended recipient client.  Small businesses from collaboration centers may offer, as a service, analytics using machine learning that yield study results, model training, algorithms, diagnostics, predictions, and even security-enhancing features.  Machine learning is task driven: it studies a data set or a compilation of entries and catalogs them by instructed variables.  A client's process using machine learning in the service they provide could offer ways to filter emails for advertising, rank consumer choices, stratify demographics by region, or even make predictions on investments, interest rates, climate change, and the effects of policies.
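
A minimal Python sketch of the email-filtering example, assuming the scikit-learn library is available.  The handful of training messages and labels below are made up for illustration; a real filter would learn from a much larger set of entries.

from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

messages = ["win a free prize now", "meeting moved to 3pm",
            "free offer click now", "quarterly report attached"]
labels   = ["spam", "ham", "spam", "ham"]

vectorizer = CountVectorizer()
features = vectorizer.fit_transform(messages)   # the instructed variables
model = MultinomialNB().fit(features, labels)   # the task-driven study of the set

new_message = ["claim your free prize"]
print(model.predict(vectorizer.transform(new_message)))   # ['spam']
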
Open Source – In the growing sharing economy that relies on collaboration, open source has been very beneficial.  The otherwise unseen source code can be shared and utilized to create improved software. Open source software has gone mainstream, fostering innovation and serving as a backbone for later generations of processes.  Once licensed to use open source software, the user can modify it.  The key is the restriction that when a modified version is shared or made available to others, its source code must also be shared.  Open source providers are focused on transparency and the free exchange of code.  That transparency allows adaptations to the software code to be inspected by others, who may be able to improve on them. By allowing access to the source code, the community can review the software and build on it.

Technology Stack -  The combination of software components that addresses the operating system (OS), the web server (Apache), database handling (MySQL), and the server coding environment (PHP) presents a typical stack.  In operations, contracts address the client side of the service and the server side.  The tech stack is a categorization of the software that will be employed to run both the back end and the front end. It is the stack that comprises the architecture of the system for the business, entity, government agency, you name it. The backbone of the entity's functioning mission, or what are called business rules, operates in the back end of the system.  Consumers, customers, and the like access the front end through a web browser or, if the individual is accessing on a mobile device, through an app interface.  While a system has its tech stack comprising its system software, applications are stacked separately.  As cloud issues arise with clients, servers and their programs become an issue regarding database programs and the software support they will receive within the stack components.
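
A minimal back-end sketch of that front-end/back-end division.  The typical stack named above uses PHP on the server; this substitutes Python with the Flask library purely for consistency with the other examples on this page, and the endpoint and pricing rule are hypothetical.

from flask import Flask, jsonify

app = Flask(__name__)

# Back end: where the 'business rules' live (here, a toy volume-discount rule).
@app.route("/api/price/<int:quantity>")
def price(quantity):
    unit = 10.0
    total = unit * quantity * (0.9 if quantity >= 10 else 1.0)
    return jsonify({"quantity": quantity, "total": total})

# Front end: a web browser or mobile app calls this URL and renders the JSON result.
if __name__ == "__main__":
    app.run()   # the web server tier of the stack would sit in front of this
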
Originally published www.lorenzolawfirm.com March 8, 2017
Lorenzo Law Firm is “Working to Protect your Business, Ideas, and Property on the Web." Copyright 2017, all rights reserved Lorenzo Law Firm, P.A.