vector space
a vector space $V$ is a set of vectors such that two vectors can be added to form another vector in $V$, and a vector can be multiplied by a scalar to form another vector in $V$. this addition and scalar multiplication must satisfy these rules:
- additive identity. there exists a vector $0 \in V$ such that for any $v \in V$, we have $v + 0 = v$.
- additive inverse. for any $v \in V$, there exists a vector $w \in V$ such that $v + w = 0$.
- commutative law for addition. for all $u, v \in V$, we have $u + v = v + u$.
- associative law for addition. for all $u, v, w \in V$, $(u + v) + w = u + (v + w)$.
- multiplicative identity. for all $v \in V$, we have $1v = v$.
- associative law for multiplication. for all scalars $a, b$ and all $v \in V$, $(ab)v = a(bv)$.
- distributive law for scalar addition. for all scalars $a, b$ and all $v \in V$, we have $(a + b)v = av + bv$.
- distributive law for vector addition. for all scalars $a$ and all $u, v \in V$, we have $a(u + v) = au + av$.
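a quick worked check (my own example, not from the cited text): in $\mathbb{R}^2$ with componentwise operations, the commutative and distributive rules hold with concrete numbers, e.g.
$$(1,2) + (3,4) = (4,6) = (3,4) + (1,2), \qquad 2\bigl((1,2) + (3,4)\bigr) = (8,12) = 2(1,2) + 2(3,4).$$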
a vector space over $\mathbb{R}$ is called a real vector space.
[cite:;taken from @algebra_axler_2024 chapter 1 vector spaces; definition 1.22]
a vector space over $\mathbb{C}$ is called a complex vector space.
[cite:;taken from @algebra_axler_2024 chapter 1 vector spaces; definition 1.22]
- vector space intersection yields a vector space.
- vector space addition (the sum of two vector spaces) yields a vector space.
- vector space union doesn't necessarily yield a vector space (see the counterexample below).
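a standard counterexample for the last point: the $x$-axis and the $y$-axis are both vector spaces in $\mathbb{R}^2$, but their union is not closed under addition, since
$$(1,0) + (0,1) = (1,1)$$
lies on neither axis.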
a vector space is called finite-dimensional if it has a basis consisting of a finite number of vectors. the unique integer $n$ such that every basis for $V$ contains exactly $n$ elements is called the dimension of $V$ and is denoted by $\dim(V)$. a vector space that is not finite-dimensional is called infinite-dimensional.
[cite:;taken from @algebra_insel_2019 chapter 1 vector spaces]
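for example (my own illustration, not from the cited text), $\mathbb{R}^3$ is finite-dimensional with $\dim(\mathbb{R}^3) = 3$, since every basis has exactly three elements, e.g. the standard basis
$$e_1 = (1,0,0), \quad e_2 = (0,1,0), \quad e_3 = (0,0,1),$$
while the space of all polynomials is infinite-dimensional, because $1, x, x^2, \dots$ is an infinite linearly independent list.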
some stuff from college
let $V$ be a non-empty set of vectors whose components are numbers from the field $F$, and consider the 2 operations addition and scalar multiplication. $V$ is a vector space if and only if it abides by the following axioms:
addition axioms:
- addition closure: for every $u, v \in V$ we have $u + v \in V$
- associative addition: for every $u, v, w \in V$ we have $(u + v) + w = u + (v + w)$
- commutative addition: $u + v = v + u$
- zero vector: there exists $0 \in V$ so that $v + 0 = v$ for every $v \in V$
- negative vector: for every $v \in V$ there exists $-v \in V$ so that $v + (-v) = 0$
multiplication axioms:
- multiplication closure: for every $v \in V$ and $c \in F$ we have $c \cdot v \in V$
- associative multiplication: for every $v \in V$ and $a, b \in F$ we have $a(bv) = (ab)v$
- identity vector: for every $v \in V$ we have $1 \cdot v \in V$, where $1$ is the unit scalar of $F$
- identity law: for every $v \in V$ we have $1 \cdot v = v$
- first distributive law: for every $u, v \in V$ and $c \in F$ we have $c(u + v) = cu + cv$
- second distributive law: for every $v \in V$ and $a, b \in F$ we have $(a + b)v = av + bv$
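as an illustration (my own example, not from the course notes): the line $\{(x, y) \in \mathbb{R}^2 : y = x + 1\}$ is not a vector space over $\mathbb{R}$, because it fails the zero-vector axiom ($(0,0)$ is not on the line) and also fails addition closure, e.g.
$$(0, 1) + (1, 2) = (1, 3), \qquad 3 \neq 1 + 1.$$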
over some $V = F^n$ (the set of $n$-tuples with entries in $F$) and some field $F$:
the definition of summation would be:
$$(u_1, \dots, u_n) + (v_1, \dots, v_n) = (u_1 + v_1, \dots, u_n + v_n)$$
and for some $c \in F$ the definition of multiplication would be:
$$c \cdot (v_1, \dots, v_n) = (c v_1, \dots, c v_n)$$

this is an example of the 1st addition axiom: the componentwise sum of two $n$-tuples over $F$ is again an $n$-tuple over $F$.
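for instance (my own numbers), in $\mathbb{R}^3$:
$$(1, 2, 3) + (4, 5, 6) = (5, 7, 9), \qquad 2 \cdot (1, 2, 3) = (2, 4, 6),$$
and both results are again elements of $\mathbb{R}^3$.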
let $F$ be a field, and let $F[x]$ denote all the polynomials of the variable $x$ over the field $F$.
let $p(x), q(x) \in F[x]$ as an example. the addition of 2 polynomials is done coefficient-wise, and the symbolic process of addition can be described as follows: let
$$p(x) = \sum_{i=0}^{n} a_i x^i, \qquad q(x) = \sum_{i=0}^{m} b_i x^i,$$
so there exist the polynomial degrees $n = \deg p$ and $m = \deg q$, and
$$p(x) + q(x) = \sum_{i} (a_i + b_i) x^i,$$
and the symbolic process of constant multiplication is defined as:
$$c \cdot p(x) = \sum_{i=0}^{n} (c\,a_i) x^i.$$
as for the degrees of the resulting polynomials after multiplication/addition:
if $p + q \neq 0$: $\deg(p + q) \leq \max(\deg p, \deg q)$
if $c \neq 0$: $\deg(c \cdot p) = \deg p$

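a quick concrete instance (my own numbers, not the lost example from the notes): with $p(x) = 1 + 2x + 3x^2$ and $q(x) = 4 - 2x$ over $\mathbb{R}$,
$$p(x) + q(x) = 5 + 3x^2, \qquad 2 \cdot p(x) = 2 + 4x + 6x^2,$$
so $\deg(p + q) = 2 = \max(\deg p, \deg q)$ here even though the $x$ coefficient cancels, and $\deg(2p) = \deg p = 2$.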
let $F_n[x]$ denote the zero polynomial together with the polynomials of degree at most $n$.
this describes all the polynomials of degree $\leq n$ over $F$, and it is again a vector space: the addition of 2 polynomials stays inside it, as follows. let $p, q \in F_n[x]$; if $p + q \neq 0$ then $\deg(p + q) \leq \max(\deg p, \deg q) \leq n$, so $p + q \in F_n[x]$.
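for instance (my own illustration), in $\mathbb{R}_2[x]$:
$$(1 + x^2) + (x - x^2) = 1 + x,$$
which still has degree $\leq 2$, so it remains inside the space.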
a vector space could look something like this:
all the vectors that lie on the blue line represent a vector space: multiplying any such vector by a scalar only makes it longer (or shorter), so it does not move off the blue line, and the addition of any 2 vectors that lie on the blue line also results in a vector that lies on the same line, which extends across the entire 2d plane.
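to make this concrete (my own example), take the line $L = \{(t, 2t) : t \in \mathbb{R}\}$ through the origin:
$$(1, 2) + (3, 6) = (4, 8) \in L, \qquad 5 \cdot (1, 2) = (5, 10) \in L,$$
so $L$ is closed under both operations, while a line that misses the origin already fails the zero-vector axiom.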
reduction law
for every $u, v, w \in V$: if $u + w = v + w$ then $u = v$.
further consequences of the axioms:
- for every $v \in V$: $0 \cdot v = 0$ (the zero scalar times any vector gives the zero vector)
- for $a \in F$: $a \cdot 0 = 0$
- for $v \in V$: $(-1) \cdot v = -v$
- for $a \in F$ and $v \in V$: if $a \cdot v = 0$ then $a = 0$ or $v = 0$
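a short derivation of the property $0 \cdot v = 0$ using the reduction law (standard argument, sketched here):
$$0 \cdot v + 0 \cdot v = (0 + 0) \cdot v = 0 \cdot v = 0 \cdot v + 0,$$
so cancelling $0 \cdot v$ from both sides gives $0 \cdot v = 0$.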