Sunday, February 11, 2024

Software Quality Assurance


The Future of Software Quality Assurance


Page count: 257

Published: 19 November 2019

Format: ebook

Publisher: Springer International Publishing

Language: English

Editor: Stephan Goericke

This open access book, published to mark the 15th anniversary of the International Software Quality Institute (iSQI), is intended to raise the profile of software testers and their profession. It gathers contributions by respected software testing experts in order to highlight the state of the art as well as future challenges and trends. In addition, it covers current and emerging technologies like test automation, DevOps, and artificial intelligence methodologies used for software testing, before taking a look into the future.

The contributing authors answer questions like: "How is the profession of tester currently changing? What should testers be prepared for in the years to come, and what skills will the next generation need? What opportunities are available for further training...

Source: Publisher

https://www.google.co.in/books/edition/The_Future_of_Software_Quality_Assurance/YRS_DwAAQBAJ


Tuesday, January 16, 2024

Software Bill of Materials



https://www.sonatype.com/event/webinar-q1-2024-aws-dxc-webinar



Comprehensive SBOMs (Software Bills of Materials)



Contrast creates a comprehensive software bill of materials to meet regulatory and procurement requirements, with support for both CycloneDX and SPDX. Contrast goes above and beyond the minimum SBOM standards set by NIST, detailing critical security, versioning, environmental, and library usage information in its bill of materials.
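
For readers unfamiliar with these formats, a hedged sketch of what such a document looks like: the Python below assembles a minimal CycloneDX-1.4-style JSON SBOM with one component and one vulnerability entry. All names, versions, and the CVE identifier are made-up placeholders; real SBOMs are produced by tooling, not written by hand.

import json
import uuid
from datetime import datetime, timezone

# Minimal CycloneDX 1.4-style SBOM as a plain dict. Component names,
# versions, and the vulnerability ID are placeholders for illustration.
sbom = {
    "bomFormat": "CycloneDX",
    "specVersion": "1.4",
    "serialNumber": f"urn:uuid:{uuid.uuid4()}",
    "version": 1,
    "metadata": {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "component": {"type": "application", "name": "example-app",
                      "version": "1.0.0"},
    },
    "components": [
        {
            "bom-ref": "example-lib@2.3.1",   # hypothetical dependency
            "type": "library",
            "name": "example-lib",
            "version": "2.3.1",
            "purl": "pkg:maven/com.example/example-lib@2.3.1",
        }
    ],
    # CycloneDX 1.4 added a vulnerabilities section, which is how
    # detailed vulnerability information can travel with the SBOM.
    "vulnerabilities": [
        {
            "id": "CVE-0000-00000",            # placeholder identifier
            "affects": [{"ref": "example-lib@2.3.1"}],
        }
    ],
}

print(json.dumps(sbom, indent=2))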


The Contrast Secure Code Platform Approach to SBOMs

Contrast provides the fastest, easiest, and most scalable application security platform available. Our instrumentation-based approach to SBOMs has many advantages, and by leveraging our integrated solutions (Contrast Assess, Contrast SCA, and Contrast Protect), organizations can achieve the regulatory goals set by the Biden administration and be prepared to address any specific mandate.

Organizations today need to:

Automate SBOM generation without running any scans

Continuously stay up to date

Deliver SBOMs that cover complete apps/APIs, not fragments

Deliver SBOMs that include all libraries, including servers and runtime platforms, not just what's in the code repository

Deliver SBOMs that include services, such as backend databases, directories, queues, APIs, and more

Deliver SBOMs that contain detailed vulnerability information

Deliver SBOMs that report exactly which components are in use, and which are never loaded or used

Exclude test libraries and other non-deployed components from SBOMs


https://www.contrastsecurity.com/sboms


Broad SBOM adoption takes root as businesses watch their supply chains

Research from Sonatype shows major companies are increasingly mandating that outside vendors account for the security of their applications.


Published Aug. 4, 2023

https://www.cybersecuritydive.com/news/sbom-adoption-businesses-supply-chains/690005/



Ilkka Turunen
Ilkka serves as Field CTO at Sonatype. He is a software engineer with a knack for rapid web development and cloud computing, and with technical experience on multiple levels of the XaaS cake. Ilkka is interested in anything and everything, always striving to learn any relevant skills that help towards building Sonatype for success.
https://blog.sonatype.com/author/ilkka-turunen/page/2


OSS is now critical infrastructure.
Ilkka Turunen

The report states that open source software components have not only become critical infrastructure for modern information systems, but also that a vast majority of organisations leverage them whether or not they know it, and that most software today is assembled, not constructed.

"Software development has moved from an artisanal, soup-to-nuts process to one more akin to bricklaying"

As custodians of the largest open source ecosystem in the Java world, we at Sonatype have witnessed this transformation first hand. There were 87 billion download requests from this repository in 2017. A similar repository for JavaScript components is seeing 7 billion downloads a week, as reported in October 2018.

As the tools in the hands of developers across all programming ecosystems make it easier to leverage external code, the software engineering process itself has transformed into one resembling manufacturing.


Monday, January 15, 2024

Introduction To The Basics Of Neural Networks - Reshared Knol



Author: Christian Eder,
Software Engineer
Munich, Germany


The human brain is a complex structure consisting of billions of cells, the neurons. These neurons are connected by synapses. The neurons communicate with each other by sending electric pulses to their neighbouring neurons. Since the synapses can have different sizes, these electric pulses can be of varying power. The electricity flowing through our brain makes us remember things, react to outside influences, or move our muscles.

This (already simplified) image of our brain has been adopted by artificial neural networks: they are a kind of abstract, "mathematical" brain.

Neural networks can be formally described as a weighted graph G = (V, E). The set of vertices V can be seen as the neurons in the network. The edges E that connect the vertices are the synapses connecting the neurons, and the weights W of the edges describe the thickness or size of the synapses. One subset of V is called the input neurons, and another subset of V is called the output neurons. All vertices that are not part of the input or output subsets are called "hidden" neurons.
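
As a hedged illustration of this formalism (the neuron names and weights below are arbitrary values, not taken from the article), the weighted graph of a tiny feedforward network can be written down directly in Python:

# The weighted graph G = (V, E) of a small feedforward network:
# 2 input neurons, 3 hidden neurons, 1 output neuron.
V = {"in1", "in2", "h1", "h2", "h3", "out1"}

# Each edge (i, j) is a synapse from neuron i to neuron j; the mapped
# value is its weight, i.e. the "thickness" of the synapse.
W = {
    ("in1", "h1"): 0.5,  ("in2", "h1"): 0.3,
    ("in1", "h2"): -1.2, ("in2", "h2"): 0.7,
    ("in1", "h3"): 0.8,  ("in2", "h3"): -0.4,
    ("h1", "out1"): 1.0, ("h2", "out1"): -0.6, ("h3", "out1"): 0.2,
}

input_neurons = {"in1", "in2"}
output_neurons = {"out1"}
hidden_neurons = V - input_neurons - output_neurons  # everything else

The forward-pass sketch further below reuses these weights, arranged as one weight matrix per layer.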


The picture shows a simple, so-called feedforward network - the neurons are only connected from one layer to the next. More complex architectures for neural networks exist, e.g. with recurrent connections going from the hidden layer back to the input layer, but these are not discussed in this article, which aims to give a general overview of how artificial neural networks work.

The electric pulses which are generated by the neurons in our brain are represented as numbers in neural networks - the higher the number, the higher the voltage.

The input layer of a neural network is where the electric pulses enter the network. In nature, an example is the set of neurons connected to our eyes through the optic nerve. The output layer can be seen as a set of neurons connected to actuators like our legs - the eyes send an electric pulse to the input neurons, and our legs move because the output neurons tell them that there is a wall in front of us.

Artificial neural networks can be used to learn nonlinear functions - in other words, they can learn a mapping

f: R^n → R^m

where n is the number of input neurons and m is the number of output neurons. Each of the neurons has an input and an output value. The values of an n-dimensional input vector are handed over to the input layer: the output value of the i-th input layer neuron is set to the value of the i-th dimension of the input vector. The neurons in the hidden layer compute their input as the weighted sum of all ingoing connections from the input layer: they sum up the output values of all input neurons they are connected with, multiplying each of these values with the weight of the connecting synapse. After computing their input, the hidden layer neurons compute their output by applying an activation function to their input. The neurons in the output layer compute their input and output analogously.

Formally, each of the neurons works as follows:

net_j = Σ_i (w_ij · out_i),    out_j = φ(net_j)

where the sum runs over all neurons i with a synapse leading into neuron j, w_ij is the weight of that synapse, and φ is the activation function.
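A hedged sketch of this computation in Python, reusing the example weights from above (the sigmoid as activation function is an assumption made for illustration; the article does not fix a particular φ):

import math

def sigmoid(net):
    # One common choice for the activation function phi; any
    # differentiable function will do for backpropagation later.
    return 1.0 / (1.0 + math.exp(-net))

def layer(prev_outputs, weights):
    # For each neuron j: net_j = sum_i w_ij * out_i, out_j = phi(net_j).
    return [sigmoid(sum(w * o for w, o in zip(row, prev_outputs)))
            for row in weights]

W_hidden = [[0.5, 0.3], [-1.2, 0.7], [0.8, -0.4]]  # one row per hidden neuron
W_output = [[1.0, -0.6, 0.2]]                      # one row per output neuron

x = [0.0, 1.0]                    # n-dimensional input vector (n = 2)
hidden_out = layer(x, W_hidden)
y = layer(hidden_out, W_output)   # m-dimensional output vector (m = 1)
print(y)
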
The weights of the synapses, combined with the structure of the net (number of neurons, number of neurons per layer, ...), determine the mapping that the neural network represents. Thus, learning a specific mapping means learning the correct weights of the synapses. There are several algorithms that can be used to accomplish this. One of them is called "error backpropagation".

When using error backpropagation, teaching the network works as follows:

An input vector is given to the net, and the correct output vector expected from the net is known. The actual output vector of the net is compared with the expected one, and the difference between both defines the error vector. This error vector is assigned to the corresponding output neurons, and the error is propagated backwards through the net in order to find out how the synapses' weights would have to be changed to minimize the error. This gradient-based approach requires the activation functions of all neurons to be differentiable. A concrete implementation of the backpropagation algorithm can be found on Wikipedia.
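
As a hedged sketch of a single backpropagation training step (a one-hidden-layer net with sigmoid activations, squared error, and an arbitrary learning rate - all assumptions made for illustration, not details fixed by the article):

import numpy as np

rng = np.random.default_rng(0)

def sigmoid(net):
    return 1.0 / (1.0 + np.exp(-net))

# Arbitrary shape: n = 2 inputs, 3 hidden neurons, m = 1 output.
W1 = rng.normal(scale=0.5, size=(3, 2))  # input -> hidden weights
W2 = rng.normal(scale=0.5, size=(1, 3))  # hidden -> output weights

def train_step(x, target, lr=0.5):
    global W1, W2
    # Forward pass.
    h = sigmoid(W1 @ x)        # hidden layer outputs
    y = sigmoid(W2 @ h)        # actual output vector of the net

    err = y - target           # error vector at the output neurons

    # Backward pass: the deltas use sigmoid'(net) = out * (1 - out),
    # which is why the activation function must be differentiable.
    delta_out = err * y * (1.0 - y)
    delta_hid = (W2.T @ delta_out) * h * (1.0 - h)

    # Change the weights a small step against the error gradient.
    W2 -= lr * np.outer(delta_out, h)
    W1 -= lr * np.outer(delta_hid, x)
    return float(err @ err)    # squared error, shrinks over the steps

# Example: teach the net to map the input (0, 1) to the output 0.9.
for step in range(1000):
    loss = train_step(np.array([0.0, 1.0]), np.array([0.9]))
print(loss)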

Source Knol: http://knol.google.com/k/introduction-to-the-basics-of-neural-networks#

Knol Nrao - 5192



Ud. 16.1.2024
Pub. 29.4.2012