"It doesn't have to be in the bible if you wish to live outside of his Word."
Where does the Bible itself say that? Where is the concept that the Bible alone is your only source of faith and morals actually stated in the Bible? If that doctrine were true, it ought to be there, and it ought to be stated clearly.
Or is this an extra-biblical tradition you hold?