Some basic concepts of CNN

Finally, I want to introduce the Convolutional Neural Network (CNN). The reason is that a CNN involves a lot of mathematics, which makes it quite cumbersome to implement from scratch using only NumPy. If you just want to understand its intuitive ideas and implement it in a simple way, however, the TensorFlow framework makes CNN much easier and more straightforward. I plan to explain how to integrate CNN into our previously implemented neural network framework in a chapter without any asterisks, and I will also provide an asterisked chapter where I walk through implementing a CNN from scratch using only NumPy. That way, readers who prefer a lighter read can skip the more complex part and enjoy a relaxed and happy mood with the non-asterisked section (σ’ω’σ).

This chapter focuses on some fundamental concepts of CNN. First, I want to clarify that, structurally, a basic CNN is not very different from a standard neural network (though more complex CNNs can differ significantly). In other words, a simple CNN contains exactly two main components:

1. A hierarchical layer structure
2. A network structure that integrates these layers

Therefore, when implementing the algorithm, we are essentially revisiting the corresponding parts of the previously implemented neural network.

After understanding the structure, it is important to look at the core ideas behind CNN. Generally, they can be summarized in two key points: **local connection (sparse connectivity)** and **weight sharing**. These ideas are quite intuitive. For example, when we look at a scene, we don’t take it in all at once; instead, we observe it piece by piece, receiving information "in parts" (what we call the "local receptive field"). During this process our way of thinking stays relatively consistent, and after observing the whole scene we might form a judgment like “this scenery is beautiful” and then adjust our perception accordingly. In this analogy, each "piece" of the scene represents a local connection, and our thoughts represent the weights. When we look at the scene, we apply the same set of thoughts to different parts, and that is the biological meaning of weight sharing. (Note: this analogy is my own, and I cannot guarantee its academic rigor, so I encourage readers to approach it critically. If experts find my explanation too simplistic, I hope you’ll still smile along with (σ’ω’σ).)

The textual explanation might still feel a bit abstract, so I drew a diagram to help visualize it (it is based on a reference image that I found somewhat misleading, which is why I drew my own version, even if it is not perfect):

![CNN vs. NN](http://i.bosscdn.com/blog/1R/G6/11/57-0.jpg)

This image highlights the difference in how an NN and a CNN process input. On the left is a fully connected neural network, where every neuron connects to all inputs; this is the global receptive field. Each neuron is independent, which makes it hard to extract meaningful "vision" from the raw data. On the right, the CNN uses a local receptive field combined with shared weights: the neurons are grouped into "blocks" of view, and each block shares the same weights (represented by the thick green lines), which scan over different parts of the original input.

Next, I will discuss how to implement this idea, namely the convolution operation in CNN.
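To make local connection and weight sharing concrete before we touch the framework code, here is a minimal NumPy sketch of my own (an illustration only, not part of the framework): a single small kernel, i.e. one shared set of weights, slides over the input, and each output value only "sees" a small window.

```python
import numpy as np

def conv1d_valid(x, kernel):
    # Every output position uses the SAME kernel (weight sharing)
    # and only looks at a small window of x (local connection).
    k = len(kernel)
    return np.array([np.dot(x[i:i + k], kernel)
                     for i in range(len(x) - k + 1)])

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])  # a tiny 1D "scene"
kernel = np.array([0.5, -0.5])           # one shared set of weights
print(conv1d_valid(x, kernel))           # [-0.5 -0.5 -0.5 -0.5]
```

Notice that two weights cover the entire input here; a fully connected layer producing the same four outputs would need 4 × 5 = 20 independent weights. That economy is exactly what the two core ideas buy us.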
The detailed math will be covered in the math series, but here is the core code (again, thanks to TensorFlow!):

```python
def _conv(self, x, w):
    # tf.nn.conv2d requires a stride of 1 on the batch and channel axes,
    # so only the two spatial dimensions use the layer's stride
    return tf.nn.conv2d(x, w, strides=[1, self._stride, self._stride, 1],
                        padding=self._pad_flag)

def _activate(self, x, w, bias, predict):
    # Convolve, add the bias, then reuse the base layer's activation
    res = self._conv(x, w) + bias
    return layer._activate(self, res, predict)
```

This involves concepts that will be explained later, but if you understand the basic ideas, the code should still be fairly readable.
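To get a feel for what `_conv` computes, here is a self-contained sketch (the shapes and tensor names are my own assumptions, not part of the framework). `tf.nn.conv2d` takes inputs in NHWC layout (batch, height, width, channels) and filters in HWIO layout (height, width, input channels, output channels):

```python
import numpy as np
import tensorflow as tf

# One 5x5 single-channel "image" in NHWC layout
x = tf.constant(np.random.randn(1, 5, 5, 1), dtype=tf.float32)
# Four 3x3 filters over that single channel, in HWIO layout
w = tf.constant(np.random.randn(3, 3, 1, 4), dtype=tf.float32)

# Stride 1 on the spatial axes, no padding
y = tf.nn.conv2d(x, w, strides=[1, 1, 1, 1], padding="VALID")
print(y.shape)  # (1, 3, 3, 4): each 3x3 window yields one value per filter
```

With `"VALID"` padding the 3x3 window fits in 5 - 3 + 1 = 3 positions along each spatial axis, which is where the (1, 3, 3, 4) output shape comes from; `"SAME"` padding would keep the spatial size at 5x5 instead.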
