Show That the Space $C(\Bbb R)$ of All Continuous Functions Defined on the Real Line Is Infinite-Dimensional
Show that the vector space of all continuous real-valued functions is infinite-dimensional
Solution 1
Consider the subspace of $C(\Bbb R)$ whose vectors are the polynomials. This subspace has the following basis: $$\langle1,x,x^2,x^3,x^4,\dots\rangle$$ These elements are linearly independent. To show this, suppose there exists a finite linear combination that evaluates to zero everywhere: $$\sum_ia_ix^i=0$$ The left-hand side is a polynomial with infinitely many roots. Any non-zero polynomial, however, has only finitely many roots (no more than its degree). Therefore the left-hand side is the zero polynomial, i.e. $a_i=0$ for all indices $i$.
Since the basis has infinitely many elements, the subspace, and thus $C(\Bbb R)$, is infinite-dimensional.
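To see the root-counting step computationally: a combination of $1,x,\dots,x^n$ that vanishes everywhere in particular vanishes at any $n+1$ distinct points, and the resulting Vandermonde system has only the trivial solution. Here is a minimal numpy sketch; the evaluation points are an arbitrary choice of distinct reals.

```python
import numpy as np

# If sum_i a_i x^i (degree <= n) vanishes at n+1 distinct points, the
# coefficient vector a solves V @ a = 0 for the Vandermonde matrix V below.
n = 5
points = np.arange(n + 1, dtype=float)         # any n+1 distinct points work
V = np.vander(points, n + 1, increasing=True)  # V[j, i] = points[j] ** i

# V has full rank, so V @ a = 0 forces a = 0: the monomials 1, x, ..., x^n
# are linearly independent.
print(np.linalg.matrix_rank(V) == n + 1)       # True
```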
Solution 2
One can manufacture linearly independent functions by detecting linear independence through evaluation. Concretely, suppose you have a family of functions $f_i : \mathbb R\to\mathbb R$. Any linear dependence relation $$\sum \lambda_i f_i=0$$ gives, upon evaluation at some $t\in\mathbb R$, a linear dependence relation in $\mathbb R$: $$\sum \lambda_i f_i(t)=0$$ If we choose $t$ appropriately we can obtain information about the $\lambda_i$ and hopefully show each $\lambda_i=0$.
Even more concretely, consider the following families of functions
- A sequence $f_0,f_1,\ldots$ such that $f_i$ is zero outside the interval $[i,i+1]$ and takes a nonzero value somewhere inside that interval,
- A sequence $f_0,f_1,\ldots$ such that $f_i(j)=0$ if $j<i$ but $f_i(i)=1$,
- A sequence $f_0,f_1,\ldots$ such that $f_i$ has $(f_i)^{(i+1)}=0$ but $(f_i)^{(i)}\neq 0$.
I claim that, without knowing exactly what those functions are, I can prove that each of those collections is linearly independent. Indeed, consider a linear dependence relation as above.
- If we evaluate such an equality at some $z \in (j,j+1)$ where $f_j(z)\neq 0$, then all the terms $f_i(z)$ with $i\neq j$ are zero, so we obtain $\lambda_jf_j(z)=0$, and because $f_j(z)\neq 0$, $\lambda_j=0$. Hence this collection is linearly independent; more generally, nonzero functions with pairwise disjoint supports are always linearly independent.
- For the second, again consider a linear dependence relation. By the condition, if we evaluate the relation at $t=0$ then the only surviving term is that with $f_0$, and we obtain $\lambda_0=0$. Now evaluate at $t=1$ to obtain $\lambda_1=0$. Since only finitely many terms of the relation can be nonzero, continuing in this way yields $\lambda_i=0$ for every $i$.
- For the last example, consider such a linear dependence relation and (for the sake of contradiction) let $j$ be the highest index such that $\lambda_j\neq 0$. If you differentiate the equation $j$ times, all the $f_i$ with $i<j$ vanish by our condition, and there is $z$ such that $f_j^{(j)}(z)\neq 0$, so we obtain $\lambda_j f_j^{(j)}(z)=0$, contradicting that $\lambda_j$ was nonzero.
I invite you to produce such infinite families (that is, perhaps give explicit continuous functions that have such properties) and try to come up with more conditions that "detect linear independence upon evaluation."
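Taking up that invitation, one possible realization (among many) uses tent functions for the first family, truncated ramps for the second, and monomials for the third; the formulas below are merely illustrative choices satisfying the stated conditions.

```python
import numpy as np

def tent(i):
    # First family: zero outside [i, i+1], positive at the midpoint i + 1/2.
    return lambda x: np.maximum(0.0, 1.0 - abs(2.0 * x - (2 * i + 1)))

def ramp(i):
    # Second family: vanishes at every integer j < i and equals 1 at x = i.
    return lambda x: np.maximum(0.0, x - i + 1.0)

def monomial(i):
    # Third family: the (i+1)-st derivative of x^i vanishes; the i-th is i!.
    return lambda x: x ** i

# Spot-check the defining properties for the first few indices.
for i in range(4):
    assert tent(i)(i + 0.5) > 0 and tent(i)(i - 0.25) == 0 == tent(i)(i + 1.25)
    assert ramp(i)(i) == 1.0 and all(ramp(i)(j) == 0.0 for j in range(i))
```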
Solution 3
Just show that the subspace of polynomials, $P_\infty$, is infinite-dimensional. Suppose, for contradiction, that it is finite-dimensional. Then $P_\infty$ has a finite basis, say $p_1, \cdots, p_n$. Define $k = \max \{\deg{(p_1)}, \cdots, \deg{(p_n)} \}$. Every element of $\text{span}\{p_1, \cdots, p_n\}$ has degree at most $k$, so $x^{k + 1} \not \in \text{span}\{p_1, \cdots, p_n\}$; but $x^{k+1} \in P_\infty$, a contradiction.
Solution 4
It suffices to show that for every $n$ there is a surjective linear map from $C(\Bbb R)$ onto $\mathbb R^n$, since a space of finite dimension $d$ admits no linear surjection onto $\mathbb R^n$ when $n>d$.
The map $f \mapsto (f(x_1), f(x_2), \dots, f(x_n))$, for any pairwise distinct $x_i$, suffices. Prove that it is linear, and that it has every element of the standard basis in its image, as the sketch below makes concrete.
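Concretely, the $k$-th standard basis vector of $\mathbb R^n$ is the image of the $k$-th Lagrange basis polynomial for the nodes $x_1,\dots,x_n$, itself a continuous function. A short sketch, with an arbitrary choice of distinct nodes:

```python
import numpy as np

# Arbitrary pairwise distinct evaluation points x_1, ..., x_n.
x = np.array([0.0, 1.0, 2.0, 3.0])
n = len(x)

def lagrange_basis(k):
    # Continuous (polynomial) function equal to 1 at x[k], 0 at other nodes.
    others = np.delete(x, k)
    return lambda t: np.prod((t - others) / (x[k] - others))

# The evaluation map T(f) = (f(x_1), ..., f(x_n)) sends the k-th Lagrange
# polynomial to the k-th standard basis vector, so T is surjective.
for k in range(n):
    image = [lagrange_basis(k)(xi) for xi in x]
    print(np.allclose(image, np.eye(n)[k]))   # True for each k
```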
Solution 5
(1). Let $(a_n)_{n\in \mathbb N}$ be a strictly increasing real sequence, converging to $A.$ Suppose that for each $n$ there exists continuous $f_n:\mathbb R\to \mathbb R$ such that $(x<A\land f_n(x)=0)\iff x\in \{a_j:j\geq n+1\}.$ Then $S=\{f_n:n\in \mathbb N\}$ is an infinite linearly independent set.
Indeed, let $T$ be a non-empty finite subset of $S,$ and let $n_T$ be the largest $n$ such that $f_n\in T.$ Then $f_{n_T}\ne f\in T\implies f(a_{n_T})=0$ (since $f=f_m$ with $m<n_T$ and $a_{n_T}\in\{a_j:j\geq m+1\}$), while $f_{n_T}(a_{n_T})\ne 0.$ So if $\{r_f:f\in T\}$ is a set of non-zero numbers we have $$\forall x\;(\sum_{f\in T}r_f f(x)=0)\implies \sum_{f\in T}r_ff(a_{n_T})=0\implies r_{f_{n_T}}f_{n_T}(a_{n_T})=0\implies r_{f_{n_T}}=0,$$ a contradiction to $r_{f_{n_T}}\ne 0.$
(2). To obtain such an $S$: Let $g :[a_2,A)\to \mathbb R$ be continuous such that for each $n\in \mathbb N$, on the interval $[a_{n+1},a_{n+2}]$ we have $g(a_{n+1})=0=g(a_{n+2})$ and $0\ne|g(x)| \leq 2^{-n}$ for $x\in (a_{n+1},a_{n+2}).$
For example let $g(x)=2^{-n}\frac {(x-a_{n+1})(x-a_{n+2})}{(a_{n+2}-a_{n+1})^2}$ for $ x\in [a_{n+1},a_{n+2}].$
Now let $f_n(x)=x-a_{n+1}$ for $x<a_{n+1},$ and $f_n(x)=g(x)$ for $x\in [a_{n+1}, A),$ and $f_n(x)=x-A$ for $x\geq A.$
(It is easy to show that each $f_n$ is continuous at $x=A$, using $f_n(A)=0$ and $\sup \{|g(x)|:x\in [a_{j+1},a_{j+2}]\}\leq 2^{-j}$ for all $j$.)
This proof also applies to the set of continuous $f:J\to \mathbb R$ for any interval $J\subset \mathbb R$ of non-zero length, as we may take $\{A\}\cup \{a_n:n\in \mathbb N\}\subset J.$
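As a sanity check of this construction, here is a small numerical sketch with the concrete (and entirely arbitrary) choices $a_n=1-2^{-n}$ and $A=1$:

```python
# Concrete instance of the construction: a_n = 1 - 2**(-n) increases to A = 1.
A = 1.0

def a(n):
    return 1.0 - 2.0 ** (-n)

def g(x):
    # On [a_{n+1}, a_{n+2}], a parabolic bump vanishing at both endpoints
    # and bounded by 2**(-n) in between (the formula from the answer).
    n = 1
    while not (a(n + 1) <= x <= a(n + 2)):
        n += 1
    lo, hi = a(n + 1), a(n + 2)
    return 2.0 ** (-n) * (x - lo) * (x - hi) / (hi - lo) ** 2

def f(n):
    def fn(x):
        if x < a(n + 1):
            return x - a(n + 1)
        if x >= A:
            return x - A
        return g(x)
    return fn

# Below A, f_2 vanishes exactly at a_3, a_4, a_5, ...:
print([f(2)(a(m)) == 0.0 for m in range(1, 7)])
# [False, False, True, True, True, True]
```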
Comments
-
Show that the vector space $C(\Bbb R)$ of all continuous functions defined on the real line is infinite-dimensional.
I get that if $C(\Bbb R)$ contains an infinite-dimensional subspace, then it is infinite-dimensional, but how do I prove that? Obviously $\Bbb R$ is infinite…
-
Or the span of $f_n(x) = \max(0, 1-|x-2n|)$.
-
Well, you have to prove such infinite set is linearly independent. The set $\{x,2x,3x,\ldots\}$ is also infinite but linearly dependent.
-
@PedroTamaroff this works?
-
Perhaps I was a bit unfair. Polynomials are more or less by definition an $\mathbb R$-vector space with basis the powers of $x$.
-
You can use the Fundamental Theorem of Algebra to rigorously justify that this set is linearly independent.
-
@GyuEunLee this works?
-
@ParclyTaxel Yup, it's all good now.
-
"i.e. $a_i = 0$" - you are assuming that the constant $0$ function has a unique expansion $(a_0,a_1, \dots)$. How do you know this without first assuming that the $x_i$ are linearly independent? Also, I cannot understand your bit about infinite roots and whatnot. The $0$ (i.e. additive identity) in $C(\mathbb R)$ is already the $0$ function-- you don't need to prove this.
-
@ParclyTaxel - I suppose my problem comes down to two things: $1)$ I had forgotten that it is only necessary to check finite summations of elements (your notation makes it ambiguous as to whether the sum is finite or not; but it's my fault really) and $2)$ although obvious, you have not shown that the only way to express the $0$ function with a finite number of terms is with all coefficients equal to zero. I suppose this is minor enough to retract the downvote though.
-
...but for whatever reason, SE won't let me until your answer is edited.
-
@Myridium Answer edited.
-
Do you mean (for the first family of functions) "such that $f_i$ is zero outside the interval"?
-
Yes, I do, @psmears.
-
Not the Fundamental Theorem of Algebra which states that non-constant complex polynomials have at least one root. What you use is a basic (but nameless) theorem in commutative algebra that says that over an integral domain (in particular over any field) a nonzero polynomial can have no more roots than its degree.
-
Why can we still call $\sum_{i} a_i x^i$ a polynomial? I thought polynomials are only allowed to have finitely many terms. When the index set is countably infinite, from $0$ to $\infty$, this power series can have infinitely many zeros (consider the power series expansion of $\sin(x)$). Am I missing what you are saying?
-
@nekodesu (three years later): I think the sum is intended to have finitely many terms. We only need to consider finite linear combinations for linear independence. (Indeed, a general vector space doesn't allow infinite sums of vectors: usually you need a topology, or some other operations, for that).
Source: https://9to5science.com/show-that-the-vector-space-of-all-continuous-real-valued-functions-is-infinite-dimensional