https://www.deepl.com/zh/translator
https://pulipulichen.github.io/HTML5-Wrapped-Text-Formatter/
Everyone's life is different. There are moments, and they are common, when you cannot quite reach the thing you want to reach.
Even when you finally achieve a goal you have dreamed of for a long time, you may find, at the moment you reach it, that the dream turns out somewhat different from what you imagined.
This applies anywhere; personally, I tend to apply it to my relationships.
Although we have not experienced what others have been through, they, in turn, have not experienced what we have been through either.
The most important thing in your whole life is to love yourself, because the only person who can accompany you to the end of your life is you. Your parents, friends, partner, and so on may all leave you at any moment, so as the one person you will see until the day you die, you should definitely love yourself, both physically and mentally.
I know it is hard to love ourselves, but we should try our best. If we do not like ourselves yet expect others to love us, that does not make sense, and if others then leave us, it is devastating.
We are born alone and we die alone. Whatever ending you get, I would call it a good ending, because living as a human is hard; you have already tried your best, and we are all doing our best.
Written by Weibert有錢小崴少
The second formula from the top uses the sum squared error (誤差平方和) to compute the localization error of W and H. The figure below shows why the square root is taken: when a bounding box is small (the upper one), its W and H are also small. If its W differs from the ground truth (標註的真實值) W by the same amount as a larger bounding box's does, that does not mean the small box and the large box have the same IoU with the ground truth. As you can see in the figure, the large bounding box actually overlaps the ground truth over a larger area, so if you measure by IoU, the large bounding box's IoU is higher than the small one's.
Suppose the ground-truth width ŵ is 0.4, the large bounding box's w is 0.6, and the small bounding box's w is 0.2.
Formula: (√w − √ŵ)²
💛 Without the square root it looks like this:
Large: (0.6 − 0.4)² = 0.04
Small: (0.2 − 0.4)² = 0.04
💛 With the square root it looks like this:
Large: (√0.6 − √0.4)² ≈ 0.02
Small: (√0.2 − √0.4)² ≈ 0.034
Now the small box's loss is larger and the large box's loss is smaller. Without the square root, the large bounding box would be treated unfairly: its IoU is clearly higher, yet its sum squared error would be the same as the smaller box's. We therefore add the square root so that the two errors actually differ.
💜 Key takeaway 💜
Looking at the width w: √w is the bounding box's predicted width and √ŵ is the ground-truth width. For the same absolute width error, the large bounding box's (√w − √ŵ)² is smaller than the small bounding box's (√w − √ŵ)².
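The arithmetic above can be verified with a short script. This is a minimal sketch, not code from the original post; the helper name `sq_err` is my own, and it covers only the width term of the loss:

```python
import math

def sq_err(w_pred, w_true, use_sqrt):
    """Squared-error term for a single width, optionally on the sqrt scale."""
    if use_sqrt:
        return (math.sqrt(w_pred) - math.sqrt(w_true)) ** 2
    return (w_pred - w_true) ** 2

w_true = 0.4                 # ground-truth width ŵ
w_large, w_small = 0.6, 0.2  # predicted widths of the large and small boxes

# Without the square root, both boxes receive the same loss...
print(round(sq_err(w_large, w_true, False), 3))  # 0.04
print(round(sq_err(w_small, w_true, False), 3))  # 0.04

# ...with it, the small box is penalized more, as it should be.
print(round(sq_err(w_large, w_true, True), 3))   # 0.02
print(round(sq_err(w_small, w_true, True), 3))   # 0.034
```

The square-root version reproduces exactly the numbers worked out above: the small box's error grows while the large box's shrinks.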
💗------------ A Closer Look at Convolutional Neural Networks ------------💗
💚 Number of kernels
The number of kernels mainly controls how many parameters are learned; no particular count is inherently correct. Common choices are 16, 32, or 64. If 16 kernels already achieve good results, there is no need to waste extra parameters on more learning and computation.
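As a rough illustration of how the kernel count drives the parameter count (my own sketch, not code from the article; `conv_params` is a hypothetical helper):

```python
def conv_params(k, in_ch, out_ch, bias=True):
    """Learnable parameters of one conv layer: each of the out_ch kernels
    has k*k weights per input channel, plus one bias per kernel."""
    return k * k * in_ch * out_ch + (out_ch if bias else 0)

# A 3x3 conv over an RGB image (3 input channels):
print(conv_params(3, 3, 16))  # 448 parameters with 16 kernels
print(conv_params(3, 3, 32))  # 896 -- doubling the kernels doubles the cost
```

Each extra kernel adds a fixed number of weights, so the count scales linearly, which is why 16 kernels are preferable when they already suffice.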
💚 Kernel size
The kernel size determines its receptive field, which, intuitively, is the scale at which the kernel extracts information. If the object we want to detect takes up a large portion of the image but we convolve only with very small kernels, the results may not be very good; likewise, using very large kernels on a small object is hardly appropriate either. A popular approach today is to convolve the image with kernels of several different sizes and then concatenate or average the output feature maps. Common kernel sizes are 1×1, 3×3, 5×5, and 7×7. It has also been pointed out that two stacked 3×3 convolutions have roughly the same receptive field as a single 5×5 convolution while using fewer parameters, so it is worth trying different combinations.
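The 3×3-versus-5×5 claim can be checked with a little arithmetic. This is my own sketch (both helper names are made up), assuming stride-1 convolutions where every layer maps `ch` channels to `ch` channels:

```python
def stacked_receptive_field(kernel_sizes):
    """Receptive field of stacked stride-1 convolutions:
    each extra k x k layer grows the field by k - 1."""
    rf = 1
    for k in kernel_sizes:
        rf += k - 1
    return rf

def weights(kernel_sizes, ch):
    """Weight count when every layer maps ch channels to ch channels."""
    return sum(k * k * ch * ch for k in kernel_sizes)

print(stacked_receptive_field([3, 3]))        # 5 -- same field as one 5x5
print(stacked_receptive_field([5]))           # 5
print(weights([3, 3], 64), weights([5], 64))  # 73728 vs 102400
```

With 64 channels, the stacked pair needs 18·ch² weights versus 25·ch² for the single 5×5, a saving of 28% for the same receptive field.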
Trivia: why are kernel sizes always odd? Even sizes would work too, but odd kernels have a few inherent advantages: first, an odd kernel has a center point, which makes it easier to align and pin down position information; second, an odd kernel guarantees symmetric padding (explained in detail below).
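The symmetric-padding point can be made concrete. A small sketch of my own (the helper `same_padding` is hypothetical), assuming stride-1 "same" convolutions where the total padding needed is k − 1:

```python
def same_padding(k):
    """Padding that preserves the spatial size under stride 1.
    Total padding needed is k - 1; only odd k splits it evenly."""
    total = k - 1
    return total // 2, total - total // 2  # (left, right)

print(same_padding(3))  # (1, 1) -- symmetric
print(same_padding(5))  # (2, 2) -- symmetric
print(same_padding(4))  # (1, 2) -- even kernel forces asymmetric padding
```

With an even kernel, one side must receive an extra pixel of padding, which subtly shifts the output relative to the input; odd kernels avoid this entirely.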
Quoted from:
深度學習:CNN原理 (Deep Learning: CNN Principles)
https://cinnamonaitaiwan.medium.com/%E6%B7%B1%E5%BA%A6%E5%AD%B8%E7%BF%92-cnn%E5%8E%9F%E7%90%86-keras%E5%AF%A6%E7%8F%BE-432fd9ea4935
( Image source: 卷積神經網路(Convolutional neural network, CNN): 1×1卷積計算在做什麼 )
The more kernels there are, the thicker the output image becomes (more formally, the image's depth increases), and correspondingly the more parameters there are.
( Image source: 卷積神經網路(Convolutional neural network, CNN): 1×1卷積計算在做什麼 )
Adapted from:
卷積神經網路 (Convolutional Neural Network, CNN)
https://hackmd.io/@allen108108/rkn-oVGA4