The ink lays down solidly, the halftone dots reproduce well, the registration is accurate, and the print is free of ghosting and streaks.
墨色厚实、网点还原好、套印准、无重影、无条痕。
Meaning our final output looks like that relevant dot product, but minus one.
意思是我们的最终输出看起来像那个相关的点积,但减去一。
Otherwise, that dot product would be 0 or negative, meaning the vector doesn't really align with that direction.
否则,那个点积就会是0或负数,这意味着这个向量并没有真正与那个方向对齐。
Electric flux equals the integral of the dot product of electric field and dA.
电通量等于电场与dA的点积的积分。
Electric flux equals the dot product of the electric field and the area, so let's use that.
电通量等于电场与面积的点积,所以我们就用这个公式。
Notice, this looks like a dot product between two column vectors, [m1, m2], and [v1, v2].
注意,这看起来像是两个列向量 [m1, m2] 和 [v1, v2] 间的点积。
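As a minimal sketch of the sentence above (the component values below are made-up examples), the dot product of two column vectors [m1, m2] and [v1, v2] multiplies matching components and sums the results:

```python
# Dot product of two column vectors: multiply matching components, then sum.
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

m = [2.0, 3.0]    # [m1, m2]
v = [4.0, 5.0]    # [v1, v2]
print(dot(m, v))  # 2*4 + 3*5 = 23.0
```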
However, let's use E dA cosine theta instead of the dot product.
然而,我们用 E dA cosθ 来代替点积。
And if they point in generally the opposite direction, their dot product is negative.
而如果它们大致指向相反的方向,它们的点积就是负的。
Rather than using the dot product equation, let's use the electric flux equation without the dot product.
我们不用点积方程,而是使用不含点积的电通量方程。
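The flux sentences above all use the same relation, Φ = E·A = E A cosθ for a uniform field through a flat surface. A minimal numeric sketch (the field strength, area, and angle are made-up illustrative values):

```python
import math

# Electric flux through a flat surface in a uniform field:
# Phi = E * A * cos(theta), i.e. the dot product of E and the area vector.
def electric_flux(E, A, theta_rad):
    return E * A * math.cos(theta_rad)

# Illustrative values: E = 100 N/C, A = 2 m^2, theta = 60 degrees.
# cos(60 deg) = 0.5, so the flux is about 100 N*m^2/C.
print(electric_flux(100.0, 2.0, math.radians(60)))
```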
So that wraps up dot products and cross products.
点积和叉积就讲到这里。
Similarly, its dot product with these other directions would tell you whether it represents the last name Jordan, or basketball.
类似地, 它与这些其他方向的点积会告诉你它代表的是姓乔丹,还是篮球。
For most linear transformations, the dot product before and after the transformation will be very different.
对于大多数线性变换,变换前后的点积会非常不同。
And for simplicity, let's completely ignore the very reasonable question of what it might mean if that dot product was bigger than 1.
为了简化问题,我们完全忽略掉"如果点积大于1可能意味着什么"这一合理的问题。
So when two vectors are generally pointing in the same direction, their dot product is positive.
所以当两个向量大致指向同一方向时,它们的点积是正的。
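The sign rule stated in these lines can be sketched directly (the vectors below are made-up examples): vectors pointing in generally the same direction give a positive dot product, and vectors pointing in generally opposite directions give a negative one.

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

# Generally the same direction -> positive dot product.
print(dot([1, 2], [2, 3]))    # 1*2 + 2*3 = 8

# Generally the opposite direction -> negative dot product.
print(dot([1, 2], [-2, -3]))  # 1*(-2) + 2*(-3) = -8
```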
Some of you might like to think of this as a kind of dot product.
有些人可能会认为这类似于点积。
In fact, transformations which do preserve dot products are special enough to have their own name: Orthonormal transformations.
事实上,保持点积不变的变换特殊到有自己的名称:正交变换。
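A rotation is one concrete example of such a transformation. As a minimal sketch (the vectors and angle below are made-up values), rotating both vectors by the same angle leaves their dot product unchanged:

```python
import math

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def rotate(v, theta):
    # 2-D rotation: an orthonormal transformation, so it preserves dot products.
    c, s = math.cos(theta), math.sin(theta)
    x, y = v
    return (c * x - s * y, s * x + c * y)

u, w = (1.0, 2.0), (3.0, -1.0)
theta = math.radians(30)
before = dot(u, w)
after = dot(rotate(u, theta), rotate(w, theta))
print(abs(before - after) < 1e-12)  # True: the dot product is unchanged
```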
Now, this numerical operation of multiplying a one by two matrix by a vector feels just like taking the dot product of two vectors.
现在,这个1×2矩阵乘以一个向量的数值运算,就像是求两个向量的点积。
But the operation as a whole is not just one dot product but many.
但整个操作不仅仅是单一的点积,而是多个点积。
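The "many dot products" idea in these lines can be sketched as follows (the matrix and vector values are made-up examples): each entry of a matrix-vector product is the dot product of one matrix row with the input vector.

```python
# Matrix-vector multiplication: each output entry is the dot product
# of one matrix row with the input vector.
def matvec(M, v):
    return [sum(a * b for a, b in zip(row, v)) for row in M]

M = [[1, 2],
     [3, 4],
     [5, 6]]
v = [7, 8]
print(matvec(M, v))  # [23, 53, 83] -- three dot products, one per row
```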
Luckily, this computation has a really nice geometric interpretation for thinking about the dot product between two vectors V and W.
幸运的是,这个计算有一个很好的几何解释,可以用来理解两个向量V和W之间的点积。
When they're perpendicular, meaning the projection of one onto the other is the zero vector, their dot product is zero.
当它们垂直时,也就是说一个向量在另一个向量上的投影是零向量,它们的点积为零。
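A minimal sketch of the perpendicular case (the vectors below are made-up examples): for perpendicular vectors, the projection of one onto the other collapses to the zero vector, and the dot product is zero.

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def project(u, onto):
    # Projection of u onto another vector: (u . onto / onto . onto) * onto
    scale = dot(u, onto) / dot(onto, onto)
    return [scale * c for c in onto]

u, w = [2, 0], [0, 3]  # perpendicular vectors
print(dot(u, w))       # 0
print(project(u, w))   # [0.0, 0.0], the zero vector
```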
Looking at sides 2 and 4, we need to realize that the dot product of B and ds is the same as B ds cosine theta, right?
观察边2和边4,我们需要意识到B和ds的点积与B ds cosθ相同,对吧!