In Lecture: Documents: Scalar Value Types, he keeps referring to Strings as “Scalar” types. But how does that make any sense? A scalar should be able to scale something. I can multiply a number by 5, but I can’t multiply a number by “apple”.
@Aaron_Franke_59358 I think it's one of those overloaded / interchangeable terms tbh, despite the various definitions. I've always considered primitives to be things like int, char, float, etc… and to an extent String. But in some definitions I've seen "scalar" used to describe anything that is a single value, regardless of the data type…
You have a misconception of what scalar means here. In the context of data types it isn't about scaling or multiplication; it means a single, atomic value, as opposed to a compound value like an array or an object. There's a really good answer on Stack Overflow about Scalar vs Primitive that I think is worth checking out if you find scalar value types confusing.
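For what it's worth, the "single value vs compound value" distinction can be sketched like this (a rough Python illustration; the names and the `is_compound` helper are mine, not from the lecture):

```python
# Scalar values hold exactly one value each, regardless of type.
# A string counts as one value here, even though it has characters inside.
scalar_examples = [42, 3.14, True, "apple"]

# Compound values aggregate other values.
compound_examples = [
    [1, 2, 3],                      # array/list of scalars
    {"name": "apple", "qty": 5},    # document/object of key-value pairs
]

def is_compound(value):
    """Rough check: is this a container of other values?

    Note: str is deliberately excluded, even though Python strings
    are iterable -- by the 'scalar' convention, a string is one value.
    """
    return isinstance(value, (list, dict, set, tuple))

for v in scalar_examples:
    assert not is_compound(v)
for v in compound_examples:
    assert is_compound(v)
```

So "apple" is scalar in the same sense that 5 is: it's one value, not a collection of values.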