Reviews and Comments

FractionalRadix

FractionalRadix@bookwyrm.social

Joined 1 year, 1 month ago

Software developer with a CS degree and an AI degree, both from the University of Amsterdam. This is the account where I keep track of my professional reading (as opposed to leisure reading).
I do NOT work with recruitment agencies, FULL STOP.

Mostly reading books from Manning publishers at the moment, but also re-reading some of the older stuff.


commented on Neural Networks: Algorithms, Applications, and Programming Techniques by James A. Freeman (Computation and Neural Systems Series)

James A. Freeman, David M. Skapura: Neural Networks: Algorithms, Applications, and Programming Techniques (Hardcover, 1991, Addison-Wesley) No rating

Chapter 5 discusses the Boltzmann network - yes, that's Boltzmann from the law of entropy.

In previous chapters, artificial neural networks were considered to have entropy and energy. The question then arises - shouldn't they also have temperature?

From this, a link is made to simulating the annealing process in metallurgy: heating up and then cooling down in a controlled process.
The Boltzmann network learns and retrieves this way: the simulated temperature is set to a high value, then slowly lowered, with a number of training/retrieval steps taken at each temperature.
The idea is that this controlled process will help us avoid local minima. There does not seem to be a guarantee of finding a global minimum, but we do improve our chances.
Notable about the Boltzmann network is that it is more about probability than about exact steps. The simulator code at the end of the chapter shows this: …
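The book's listing isn't reproduced here, but here is a rough sketch of the idea (my own illustration, not the book's simulator: a Metropolis-style flip/accept loop with an assumed quadratic energy function and a geometric cooling schedule):

    import math
    import random

    def local_field(state, weights, i):
        # Weighted input to unit i from all the other units.
        return sum(weights[i][j] * s for j, s in enumerate(state) if j != i)

    def anneal(state, weights, t_start=10.0, t_end=0.1, cooling=0.9, trials_per_t=50):
        # state: list of bipolar (+1/-1) unit values; weights: symmetric matrix, zero diagonal.
        t = t_start
        while t > t_end:
            for _ in range(trials_per_t):  # several update trials at each temperature
                i = random.randrange(len(state))
                # Energy change if unit i flips, with E = -1/2 * sum_ij w_ij * s_i * s_j.
                delta_e = 2 * state[i] * local_field(state, weights, i)
                # Downhill moves are always accepted; uphill moves are accepted
                # with the Boltzmann probability exp(-delta_e / t).
                if delta_e <= 0 or random.random() < math.exp(-delta_e / t):
                    state[i] = -state[i]
            t *= cooling  # slowly lower the simulated temperature
        return state

At high temperature the network escapes local minima easily; as the temperature drops it settles into a (hopefully deep) minimum - which matches the "better chances, no guarantee" point above.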

commented on Neural Networks: Algorithms, Applications, and Programming Techniques by James A. Freeman (Computation and Neural Systems Series)

James A. Freeman, David M. Skapura: Neural Networks: Algorithms, Applications, and Programming Techniques (Hardcover, 1991, Addison-Wesley) No rating

Chapter 4 discusses the Bidirectional Associative Memory (BAM) and the Hopfield memory, which is presented as a variation of the BAM. It is noted that it was probably not designed as a variation.
The BAM is different from the networks discussed earlier in several ways.
First, it does not have training; the weights are initialized to the desired values from the start. Re-training would imply calculating an entirely new weight matrix.
Second, its inputs and outputs are either +1 or -1, rather than real numbers.
When applying a value to either layer of the BAM, changes to the nodes are propagated until the network stabilizes. The eventual values on the "input" and "output" layers are then one of the vector pairs stored in the BAM.
(Note that "input layer" and "output layer" are a bit of a misnomer here; the book rightly prefers vectors named x and y). …
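A minimal numpy sketch of that construction and recall loop (my own illustration, assuming bipolar numpy vectors; not the book's simulator):

    import numpy as np

    def bipolar(h):
        # Threshold to +1/-1 (ties go to +1) so values stay bipolar.
        return np.where(h >= 0, 1, -1)

    def bam_weights(pairs):
        # "Training" is just initialization: W is the sum of the outer
        # products x * y^T over all stored vector pairs.
        w = np.zeros((pairs[0][0].size, pairs[0][1].size))
        for x, y in pairs:
            w += np.outer(x, y)
        return w

    def bam_recall(w, x):
        # Bounce activations back and forth between the two layers
        # until nothing changes anymore.
        y = bipolar(w.T @ x)
        while True:
            x_new = bipolar(w @ y)
            y_new = bipolar(w.T @ x_new)
            if np.array_equal(x_new, x) and np.array_equal(y_new, y):
                return x_new, y_new
            x, y = x_new, y_new

Note how re-training really does mean recomputing the whole matrix: there is no incremental update step in this scheme.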

commented on Neural Networks: Algorithms, Applications, and Programming Techniques by James A. Freeman (Computation and Neural Systems Series)

James A. Freeman, David M. Skapura: Neural Networks: Algorithms, Applications, and Programming Techniques (Hardcover, 1991, Addison-Wesley) No rating

Chapter 3 discusses the backpropagation network (BPN), including some applications. One example is image compression using a 3-layer network. This is a creative approach: the input and output for each sample are the same image, but the hidden layer is a quarter of the size of the input and output layer. Hence, the hidden layer is the compressed image. It would be interesting to see if the hidden layer performed a feature extraction. (My own thought here: we could perhaps apply a vector with only one node activated to the hidden layer, and see what "image" resulted).
Note that in this case, the weights from hidden-layer-to-output-layer should perform the inverse operation of the weights from input-layer-to-hidden-layer.
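A toy sketch of that probe (my own illustration; the 16-4-16 shape, the random training data and the learning rate are all made-up assumptions):

    import numpy as np

    rng = np.random.default_rng(0)

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    # Toy "images": 32 samples of 16 pixels; each sample is its own target.
    data = rng.random((32, 16))

    w1 = rng.normal(0, 0.5, (16, 4))   # input -> hidden (the "compressed image")
    w2 = rng.normal(0, 0.5, (4, 16))   # hidden -> output (the reconstruction)

    lr = 0.5
    for _ in range(2000):              # plain batch backprop on squared error
        h = sigmoid(data @ w1)
        out = sigmoid(h @ w2)
        d_out = (out - data) * out * (1 - out)
        d_h = (d_out @ w2.T) * h * (1 - h)
        w2 -= lr * h.T @ d_out / len(data)
        w1 -= lr * data.T @ d_h / len(data)

    # Probe: activate one hidden node at a time and see what "image" it decodes to.
    for i in range(4):
        probe = np.zeros(4)
        probe[i] = 1.0
        print(f"hidden unit {i}:", np.round(sigmoid(probe @ w2), 2))

If the hidden units really perform feature extraction, each probe should decode to a structured pattern rather than noise.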

commented on Neural Networks: Algorithms, Applications, and Programming Techniques by James A. Freeman (Computation and Neural Systems Series)

James A. Freeman, David M. Skapura: Neural Networks: Algorithms, Applications, and Programming Techniques (Hardcover, 1991, Addison-Wesley) No rating

Chapter 2 has a discussion of the Madaline, which is basically any network of Adalines - but the learning algorithm needs to be changed for this. The chapter closes with an implementation of the Adaline. Implementation of the Madaline is then left as a programming exercise.
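A minimal Adaline sketch (my own illustration, not the book's listing): a single linear unit trained with the Widrow-Hoff / LMS delta rule.

    import numpy as np

    class Adaline:
        def __init__(self, n_inputs, lr=0.01, seed=0):
            rng = np.random.default_rng(seed)
            self.w = rng.normal(0.0, 0.1, n_inputs)
            self.b = 0.0
            self.lr = lr

        def net(self, x):
            return self.w @ x + self.b            # linear output, used for learning

        def predict(self, x):
            return 1 if self.net(x) >= 0 else -1  # thresholded output

        def train(self, xs, ts, epochs=50):
            # xs: numpy input vectors; ts: bipolar (+1/-1) targets.
            for _ in range(epochs):
                for x, t in zip(xs, ts):
                    err = t - self.net(x)         # LMS: error on the linear output
                    self.w += self.lr * err * x
                    self.b += self.lr * err

A Madaline wires several of these units into a network; plain LMS then no longer applies directly, because the inner units have no individual target values - which is exactly why the learning algorithm needs to change.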

Gavin M. Roy: RabbitMQ in Depth (2017, Manning Publications)

RabbitMQ explained in detail

RabbitMQ is a message broker, a piece of software that can be used to pass messages between systems - or between components of systems. By doing this via a message broker, we can accomplish loose coupling and asynchronous interaction. As an added bonus, we can see the messages passed between the systems in a uniform way, which helps diagnosis when an error occurs.

As the name suggests, the book does indeed go in depth. It explains not just how to use RabbitMQ, but goes so far as to explain the structure of the messages sent. It contains examples in Python, but one does not need to know Python to benefit from the book.
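For flavor, a minimal publish in the spirit of those Python examples (my own sketch using the pika client, not quoted from the book; the queue name and the localhost broker are assumptions):

    import pika

    # Connect to a RabbitMQ broker assumed to be running on localhost.
    connection = pika.BlockingConnection(pika.ConnectionParameters("localhost"))
    channel = connection.channel()

    # Declare a queue (idempotent) and publish through the default exchange.
    channel.queue_declare(queue="example-queue")
    channel.basic_publish(exchange="",
                          routing_key="example-queue",
                          body="Hello from a loosely coupled producer")
    connection.close()

A consumer subscribes to the same queue on its own schedule, which is what buys the loose coupling and asynchrony mentioned above: neither side ever calls the other directly.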

RabbitMQ implements the AMQP protocol; however, it also offers some functionality beyond that. The book points out what parts of RabbitMQ are AMQP, and what parts are extensions. This is useful if there's a chance you'll be …