Securonix User and Entity Behavior Analytics

January 25, 2019

Today’s cyber threats are more sophisticated, operate at a larger scale, and spread rapidly. For example, in 2017 WannaCry infected 45,000 systems across 74 countries within 24 hours. Traditional correlation-based security monitoring tools cannot detect advanced threats like these: they do not scale, they lack broader context, and their analytic capabilities are weak.
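User and entity behavior analytics addresses this gap by baselining normal activity for each user or entity and scoring deviations from that baseline, rather than matching events against static correlation rules. Securonix’s production models are not described here; the following is only a minimal sketch of the general idea, with all names and thresholds invented for illustration: a per-user baseline of daily event counts, with new activity scored as a z-score against that user’s own history.

```python
import statistics

# Hypothetical per-user baselines: daily event counts from a training window.
# In a real UEBA system these would be learned continuously from log ingestion.
baselines = {
    "alice": [12, 15, 11, 14, 13, 12, 16],
    "bob":   [40, 38, 45, 42, 39, 41, 44],
}

def anomaly_score(user: str, todays_count: int) -> float:
    """Return a z-score of today's activity against the user's own baseline."""
    history = baselines[user]
    mean = statistics.mean(history)
    stdev = statistics.stdev(history) or 1.0  # avoid division by zero
    return (todays_count - mean) / stdev

# 300 events from a user who normally generates ~13 per day is flagged;
# the same rule judges bob's 46 events against his own, higher baseline.
for user, count in [("alice", 300), ("bob", 46)]:
    score = anomaly_score(user, count)
    flag = "ALERT" if score > 3.0 else "ok"
    print(f"{user}: z={score:.1f} {flag}")
```

The point of the pattern is the per-entity context: a static correlation rule would need one threshold for everyone, while the baseline makes the same event volume anomalous for one user and routine for another.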

Spotlight

Rexus Group

Rexus is a global IT infrastructure consulting and services company, offering IT infrastructure support for Presales Bid Management, Project Management, Technical Writing, Partner Certification, Technical Advanced Services, and more. Rexus’s USP is that it provides technology- and vendor-agnostic solutions. For instance, partners can use its presales and bid management or technical support services and get support on solutions from Microsoft, Cisco, Avaya, IBM AIX, HP-UX, and others, spanning collaboration, contact center, security, wireless, big data, storage, virtualization, contracts, OpenStack, and IT services, through its multi-skilled pool of subject matter experts (SMEs). This removes the need to invest time, effort, and money in onboarding multi-skilled SMEs during a resource crunch.

OTHER ARTICLES

Evolution of capabilities of Data Platforms & data ecosystem

Article | October 27, 2020

Data platforms and frameworks have been constantly evolving. At one point we were excited by Hadoop (for almost 10 years); then came Snowflake, or as I call it the Snowflake blizzard (which managed the largest software IPO in history); and then Google (which solves problems and serves use cases in a way that few companies can match).

The end of the data warehouse

Once upon a time, life was simple; or at least, the basic approach to Business Intelligence was fairly easy to describe: collect information from systems, build a repository of consistent data, and bolt on one or more reporting and visualisation tools that present information to users. Data used to be managed in expensive, slow, inaccessible SQL data warehouses, and SQL systems were notorious for their lack of scalability. Their demise came from a few technological advances, one of them the ubiquitous, and still growing, Hadoop.

On April 1, 2006, Apache Hadoop was unleashed upon Silicon Valley. Inspired by Google, Hadoop’s primary purpose was to improve the flexibility and scalability of data processing by splitting the work into smaller functions that run on commodity hardware. Its intent was to replace enterprise data warehouses based on SQL.

Unfortunately, a technology used by Google may not be the best solution for everyone else. It’s not that others are incompetent: Google solves problems and serves use cases in a way that few companies can match. It runs massive-scale applications such as its eponymous search engine, YouTube, and the Ads platform, and the technologies and infrastructure that make these geographically distributed offerings perform at scale are what make various components of Google Cloud Platform enterprise-ready and well-featured. Google has also shown leadership in developing innovations that have been released to the open-source community and are used extensively by other public cloud vendors and Gartner clients, among them the Kubernetes container management framework, the TensorFlow machine learning platform, and the Apache Beam data processing programming model. GCP likewise uses open-source offerings in its cloud while treating third-party data and analytics providers, including DataStax, Redis Labs, InfluxData, MongoDB, Elastic, Neo4j, and Confluent, as first-class citizens, with unified billing for its customers.

Silicon Valley tried to make Hadoop work, but the technology was extremely complicated and nearly impossible to use efficiently. Its lack of speed was compounded by its focus on unstructured data: you had to be a “flip-flop wearing” data scientist to truly make use of it, since unstructured datasets are very difficult to query and analyze without deep knowledge of computer science. At one point, Gartner estimated that 70% of Hadoop deployments would not achieve their goal of cost savings and revenue growth, mainly due to insufficient skills and technical integration difficulties. And seventy percent seems like an understatement.
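The “splitting the work into smaller functions” that defines Hadoop’s model is easiest to see in miniature. Below is a toy word count in the MapReduce style, written in plain Python rather than against Hadoop’s actual Java API, with the distributed machinery reduced to three in-process functions: a map step that emits key-value pairs, a shuffle that groups them by key, and a reduce step that aggregates each group.

```python
from collections import defaultdict

# A toy MapReduce word count: the same pattern Hadoop distributes
# across many commodity machines, shown here in a single process.

def map_phase(document: str):
    """Map: emit a (word, 1) pair for every word."""
    for word in document.lower().split():
        yield word, 1

def shuffle(pairs):
    """Shuffle: group all values by key (the framework's job in Hadoop)."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Reduce: aggregate each key's values independently."""
    return {word: sum(counts) for word, counts in groups.items()}

docs = ["the quick brown fox", "the lazy dog", "the fox"]
pairs = (pair for doc in docs for pair in map_phase(doc))
print(reduce_phase(shuffle(pairs)))  # {'the': 3, 'quick': 1, ...}
```

Because each map and reduce call is independent, the framework can scatter them across cheap machines; the cost, as the article notes, is that expressing anything beyond simple aggregations this way gets complicated fast.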
Data storage through the years: from GFS to Snowflake, or the Snowflake blizzard

Developing in parallel with Hadoop’s journey was that of Marcin Zukowski, co-founder and CEO of Vectorwise, who took the data warehouse in another direction: the world of advanced vector processing. Despite being almost unheard of among the general public, Snowflake was actually founded back in 2012. Snowflake is not a consumer tech firm like Netflix or Uber; it is business-to-business only, which may explain its high valuation, as enterprise companies are often seen as a more “stable” investment. In short, Snowflake helps businesses manage data stored in the cloud. The firm’s motto is “mobilising the world’s data”, because it allows big companies to make better use of their vast data stores.

Marcin and his teammates rethought the data warehouse by leveraging the elasticity of the public cloud in an unexpected way: separating storage from compute. Their message was simple: don’t pay for a data warehouse you don’t need. Pay only for the storage you need, and add compute capacity as you go. This is considered one of Snowflake’s key innovations: separating storage (where the data is held) from compute (the act of querying). By offering this service before Google, Amazon, and Microsoft had equivalent products of their own, Snowflake was able to attract customers and build market share in the data warehousing space.

Naming the company after a discredited database concept was very brave. For those of us not versed in the details, a snowflake schema is a logical arrangement of tables in a multidimensional database such that the entity-relationship diagram resembles a snowflake shape: when the schema is completely normalized along all the dimension tables, the resulting structure resembles a snowflake with the fact table in the middle. Needless to say, the snowflake schema is as far from Hadoop’s design philosophy as technically possible. While Silicon Valley was headed toward a dead end, Snowflake captured an entire cloud data market.
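For readers who want to see the shape the name refers to, here is a toy snowflake schema sketched as Python dataclasses; all table and column names are invented for illustration. The fact table sits in the middle, and each dimension is normalized out into further sub-dimension tables, branching like a snowflake.

```python
from dataclasses import dataclass

# A toy snowflake schema. In a star schema, Product would hold the
# category name directly; normalizing it out into Category (and Category
# into Department) is what gives the diagram its snowflake branches.

@dataclass
class Department:          # sub-sub-dimension
    department_id: int
    name: str

@dataclass
class Category:            # sub-dimension, normalized out of Product
    category_id: int
    department_id: int     # -> Department
    name: str

@dataclass
class Product:             # dimension table
    product_id: int
    category_id: int       # -> Category
    name: str

@dataclass
class SalesFact:           # fact table in the middle
    product_id: int        # -> Product
    store_id: int          # -> a Store dimension, omitted here
    units_sold: int
    revenue: float
```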

Read More

The case for hybrid artificial intelligence

Article | March 4, 2020

Deep learning, the main innovation that has renewed interest in artificial intelligence in recent years, has helped solve many critical problems in computer vision, natural language processing, and speech recognition. However, as deep learning matures and moves from the peak of the hype cycle into the trough of disillusionment, it is becoming clear that it is missing some fundamental components.
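The “hybrid” in hybrid artificial intelligence usually means pairing a learned model with explicit symbolic rules. The article does not prescribe an implementation, so the following is only a minimal sketch of that pattern, with the neural component stubbed out as a hypothetical classifier: a perception step produces label probabilities, and a symbolic step enforces hard domain constraints the network cannot be trusted to learn.

```python
# Hybrid AI pattern: learned perception + symbolic reasoning.
# `neural_classifier` stands in for a trained network (hypothetical stub).

def neural_classifier(image_features: list[float]) -> dict[str, float]:
    """Stub for a trained model returning label probabilities."""
    return {"stop_sign": 0.62, "speed_limit": 0.35, "billboard": 0.03}

SYMBOLIC_RULES = [
    # (condition on scene metadata, labels the rule excludes)
    (lambda scene: scene["location"] == "highway", {"stop_sign"}),
]

def hybrid_predict(image_features, scene):
    probs = neural_classifier(image_features)
    # Symbolic step: zero out labels that violate a hard domain rule,
    # e.g. stop signs do not occur on highways (a toy rule for illustration).
    for condition, excluded in SYMBOLIC_RULES:
        if condition(scene):
            for label in excluded:
                probs[label] = 0.0
    return max(probs, key=probs.get)

print(hybrid_predict([0.1, 0.9], {"location": "highway"}))  # speed_limit
```

The division of labor is the point: the network handles perception, where rules are hopeless, and the rules handle constraints, where statistics alone misfires.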

Read More

A Brand New Chip Design Will Drive AI Development

Article | February 20, 2020

The world is now heading into the Fourth Industrial Revolution, as Professor Klaus Schwab, Founder and Executive Chairman of the World Economic Forum, described it in 2016. Artificial Intelligence (AI) is a key driver of this revolution, and machine learning is critical to it. But critical to the whole process is the need to process a tremendous amount of data, which in turn boosts the demand for computing power exponentially. A study by OpenAI suggested that the computing power required for AI training surged by more than 300,000 times between 2012 and 2018. That represents a doubling of computing power roughly every three and a half months, far faster than the pace set by Moore’s Law, under which computing power has traditionally doubled about every two years. Conventional methodology is no longer enough for such significant leaps, and we desperately need a different computing architecture to stay ahead in the game.
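The doubling time follows directly from the growth figure: 300,000x growth is about 18 doublings, spread over the measurement window. A quick back-of-the-envelope check (the exact value depends on the window used; OpenAI’s published figure is a 3.4-month doubling time):

```python
import math

growth = 300_000                 # reported growth in training compute
doublings = math.log2(growth)    # ~18.2 doublings

# Months per doubling for a few plausible measurement windows.
for years in (5.0, 5.5, 6.0):
    months = years * 12
    print(f"{years} yr window: one doubling every {months / doublings:.1f} months")
# A 5-6 year window gives a doubling every ~3.3-4.0 months,
# consistent with OpenAI's published 3.4-month figure.
```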

Read More

Data Analytics the Force Behind the IoT Evolution

Article | April 3, 2020

Primarily, the IoT stack is moving beyond merely ingesting data toward data analytics and management, with a focus on real-time analysis and autonomous AI capabilities. Enterprises are finding more advanced ways to apply IoT for better and more profitable outcomes, and IoT platforms have evolved to use standard open-source protocols and components. Enterprises are now primarily focused on solving business problems such as predictive maintenance, or using smart devices to streamline business operations. Platforms still address similar needs, but early attempts to build highly discrete solutions around specific use cases, rather than broad platforms, have been successful. That means more vendors offering more choices for customers, which broadens the chances of success. Clearly, IoT platforms sit at the heart of value creation in the IoT.
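Predictive maintenance is the canonical example of this shift from ingesting data to acting on it in real time. The article names no specific method, so here is a minimal sketch of one common approach, with all sensor names and thresholds invented for illustration: maintain a rolling baseline over a vibration sensor’s stream and flag readings that drift far from recent behavior before outright failure.

```python
from collections import deque
import statistics

WINDOW = 20        # rolling baseline size (illustrative)
THRESHOLD = 3.0    # z-score beyond which we flag the machine

def monitor(readings):
    """Yield (reading, alert) pairs for a stream of sensor values."""
    history = deque(maxlen=WINDOW)
    for value in readings:
        alert = False
        if len(history) == WINDOW:
            mean = statistics.mean(history)
            stdev = statistics.stdev(history) or 1.0  # guard against zero
            alert = abs(value - mean) / stdev > THRESHOLD
        history.append(value)
        yield value, alert

# Simulated vibration readings: stable, then a bearing starts to degrade.
stream = [1.0 + 0.05 * (i % 3) for i in range(40)] + [1.8, 2.1, 2.6]
for value, alert in monitor(stream):
    if alert:
        print(f"maintenance flag: reading {value} outside rolling baseline")
```

In production the same loop would run at the edge or in the platform’s stream processor, with the flag feeding a work-order system instead of a print statement.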

Read More

