Need to do some "taking advantage of experienced people" here, I'm afraid.

I started with just the C language and decided (for a while) that it would be enough, with the other quota of my time being hardware-based. But since learning C, I feel there is simply no point in staying with the one language, because I have slowly grasped that the numerous languages each have a very crucial basis for why they continue to be used and interfaced with one another, according to the nature of the user's intent.

And besides, there is already a huge number of uber-high, jaw-drops-when-you-see-their-code level programmers in each language, so I figured that if my learning these things is ever to be useful to other people, I will maximize that usefulness by becoming as multi-programming-lingual as I can, seeing that I seem to learn a bit quicker each time I bite the bullet and google a new language name or IDE.

I have set myself "C 301", "Python 102", "Java 101" and "C++ 101" as subjects for the next 12 months in the computer science category of my study. Since I am only beginning these, I was hoping to get people's opinions on how much time I should commit to studying a) Java, b) C++, c) Perl over the next 1-2 years. (I'm 31; I must eventually go back to some stupid job :-( ), so I need to apportion my time wisely from now on.

Comments

  • edited May 2015

    IMO, as programmers we need to know at least 1 "online" & 1 "offline" language.
    For the "online" option, JavaScript is a must. And for offline there are many other options.
    I think for now you should stay w/ Java in general, including Processing of course.
    Later on you can decide if more languages can further enrich your experience! B-)

  • "im 31 i must eventually go back to some stupid job"

    Then pick the languages that give you the best job prospects. GoToLoop's suggestion is valid. For the offline language I think the choice is between C++ or Java.

    One thing you can be certain of is that things change rapidly in computing and that includes programming languages.

  • OK, so even when I've caught up I need to keep up, is what you're trying to say :-P

  • edited May 2015

    Well, yep, it was the trouble getting Android mode reliable that seems to be putting a lot of my weighting on Java. I mean, I will always keep going with pure math and physics, but it's pretty damn addictive being able to essentially emulate anything you want virtually. I know I prefer virtual Call of Duty to, say, Australia bringing back conscription and saying "OK, now try the real-life, more bleedy CoD" :-P and the same goes for the capacity to virtualize any process that might not work out so great for you in experimental physics.

    So I got to the point where I could do remote viewing, which was the initial goal of learning programming (a piece of cake now, of course, with Processing, TeamViewer, half a dozen webcams, and a few Unos plus a relay board controlling the robotic arms that the cameras, IR sensors and ultrasound sensors are mounted on :-P). But I am pretty hooked on the instant positive feedback you get, motivationally, from writing code, and add the mind-fuck of hardware-software interfacing, and I have to make room for it. There's no throwing in the towel on this one, that's for sure!!

  • Ah, so I should go C++ or Java? Would the combination of both be of no benefit?

    I'm just trying to improve my efficiency by learning languages in an order where each complements the next, hopefully making the learning curve smoother and faster; best of both worlds, hehe.

    I was liking the simple tuition style of the c++institute.com course, but I know Java is fundamental to getting Processing 2.2.1 working to its full potential for me, so I'll have to leave C++ until after Java if that's the case, since my ability to learn things using Processing 2.2.1 is a clear winner; it sort of "hints" to you about the things being done for you.

    I am FINALLY starting to grasp the programmers' definitions of class, field and type. I'm still shaky on things like the exact process that's happening in a lot of the data conversions that come pre-packaged as one-liners, i.e. str() and int(), even though I use them SO much, so that probably has to go on the list.
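    On those str()/int() one-liners: in Processing they are (as far as I know) thin convenience wrappers, with int(String) essentially leaning on Java's standard Integer.parseInt plus some error handling. A minimal plain-Java sketch of what's happening underneath (the class name is just for illustration):

    ```java
    public class Conversions {
        public static void main(String[] args) {
            // Processing's int("42") is (roughly) a wrapper around this:
            int n = Integer.parseInt("42");        // text -> int
            // ...and str(43) around this:
            String s = String.valueOf(n + 1);      // int -> text
            System.out.println(s);                 // prints: 43
            // float("3.14") similarly leans on Float.parseFloat:
            float f = Float.parseFloat("3.14");
            System.out.println(f);                 // prints: 3.14
        }
    }
    ```

    So there's no magic in the "precompiled one-liners": each is a parse (text to number) or a format (number to text) call from the Java standard library.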

    I actually wrote out a "course structure" for myself in the meantime, as they took some convincing to let me come back to uni (it's obviously a fairly feared outcome there, like that guy who walks into the video store where they keep his photo at the counter with a sign saying "do not admit").

    Anyway, if people are reading and are nice enough to add one-liners to the list below, with a percentage rating of how stupid they think it is that I hadn't already listed their addition, much appreciated.

    BTW, I'm not actually keeping track of how long I've been going since starting computer science on a compartmentalized basis (I just put the 101 behind everything in the list to annoy myself into learning more).

    Web-based programming 101: HTML, XML, CSS (basics known)

    JavaScript (no relation to Java, despite the name), PHP (basics known)

    Networking 101

    Windows shell command-line functionality 101 (and its use from apps written with Processing 2.2.1), or, by its shorthand name, MS-DOS :-P (strictly speaking cmd.exe isn't MS-DOS, but close enough; I just remembered I learnt a lot of these commands as a kid, but it was short-lived because I did something to the computer that angered the ruler of the house immensely, which is usually not good)

    Hardware specification and prototype implementation / software for interfacing of:

    USB, wireless modem, DTMF, Android, Intel, serial, RGB, digital photometry

    (add to this list please!!!)
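    For the Windows-shell item above: since a Processing sketch compiles down to plain Java, one standard way for an exported app to drive cmd.exe is Java's ProcessBuilder. A minimal sketch under that assumption (the class name and the echo command are just placeholders; it falls back to sh on non-Windows machines so the same file runs anywhere):

    ```java
    import java.io.BufferedReader;
    import java.io.InputStreamReader;

    public class RunShell {
        public static void main(String[] args) throws Exception {
            // Pick the platform shell: cmd.exe on Windows, sh elsewhere.
            boolean windows = System.getProperty("os.name").toLowerCase().contains("win");
            ProcessBuilder pb = windows
                    ? new ProcessBuilder("cmd.exe", "/c", "echo hello")
                    : new ProcessBuilder("sh", "-c", "echo hello");
            pb.redirectErrorStream(true);            // merge stderr into stdout
            Process p = pb.start();
            try (BufferedReader out = new BufferedReader(
                    new InputStreamReader(p.getInputStream()))) {
                String line;
                while ((line = out.readLine()) != null) {
                    System.out.println(line);        // prints: hello
                }
            }
            p.waitFor();
        }
    }
    ```

    Swap the echo command for any cmd.exe built-in (dir, type, etc.) and read the output back the same way.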

  • edited May 2015

    @Chrisir, right now Swift is for Apple only & it's not open either! :-@

  • Yes, but the iOS market is big enough and it's a modern language.

    Believe me, there are a lot of developers out there working only for iOS.

    ;-)

  • Ugh, are you serious? Can I get a Mac desktop system image etc. and learn the ropes of it at the same time as I'm learning how to use virtualization tech for Android? Oh, I've missed a vital one in the list: Linux commands.

    Well, I tried to find the smartphone equivalent of a PC's cmd.exe on Google Play, and what popped up, named "Terminal", is command-line based, but the first few things I tried were not accepted. "echo hello world" works though, so it must be Linux-based.

    So much to learn. I'm so angry at myself for not realising how much fun this stuff is; I was treating physics and math as "deeper" knowledge than comp science, yet adding comp science has now exponentially increased my productivity in those areas of study, so lol at my younger self.

  • edited May 2015

    Hello! You should try openFrameworks or OpenFL. Both run on iOS & Android.

    OpenFL is based on Haxe, and Haxe is a meta-language inspired by AS3 (ActionScript 3). http://haxe.io/videos/noc/processing-like-setup/

    http://www.openfl.org/

  • It can be pretty overwhelming when you see all of the potential options available to you, especially when you compare yourself to people who are experts at one option or another. I've been programming for over a decade and I feel like I'm learning more now than I ever have before.

    But think about programming languages (and libraries within those languages) as tools. Think of Java as a hammer, C++ as a screwdriver, JavaScript as a wrench, etc. Different jobs require different tools, and oftentimes a large job will require multiple tools.

    As programmers, generally we develop some kind of familiarity with a bunch of different tools over time. Learning what a hammer does (and doesn't do) can help you better understand when to use a saw, for example.

    So, instead of focusing on the tools, I would say that you should focus on the projects instead. What kinds of projects do you want to be involved in? Working on robotics? Designing websites? Developing applications used by businesses? Mobile applications? Games? Art?

    Once you decide on a project (and it doesn't have to be a final decision, you can do a different project later), choosing which tools to learn about becomes much easier. And eventually, you'll have a portfolio of projects that you can show to people, which goes a long way when looking for a job.

  • edited May 2015

    Thanks Kevin, that's probably one of the best responses I've gotten on a forum so far.

    Well, I generally split my time allocation down the middle across projects, and when I hit brick walls I generally tell myself it's time to get on the net and read about what I don't know, then get back to the project. Hopefully, by repeating this cycle until my computer science has caught up with my physics and math, I will reach an employable level of productivity in R&D, for when a group advertises a position covering the technical spread of the kinds of projects I've been working on.

    (Hence why I take an open-book approach to comp science and programming: I simply don't have the expected lifespan left to play the hero and entirely self-teach this side of things the way I did math and physics. As someone said, if I had started this aiming to reach the top of the pack programming-wise, it would just be a giant lol to people with, say, a decade's head start while I was focused elsewhere. I'm simply looking to maximize my understanding to the point where I can speak multiple languages, programming-wise, and provide the software that complements, or is implicitly required by, the prototype hardware for whatever insanity I am building electronically for a given project. I would say there are about a dozen of those; they all seem to progress as my programming and digital-electronics studies progress, taunting me about not seeing the applicability of computers outside of Maple for ten years.)

    I have several ideas that, as you guessed, do involve robotics and automation, and (I feel queasy and lame saying "machine learning"?) that's not quite what I mean; I would call it "precognitive image-processing capacity". The conceptual idea in my head, at my current level of understanding, is that the CPU is pre-loaded with various assumptions (based on data mining conducted during prior device run-time) about the pixel array expected through a particular port, before receiving any "actual" data from the digital camera (or an arbitrary imaging sensor of some spectral range). I could spam forever like a blister in the sun about this subject of interest I've developed, yet I have no idea what the mainstream programming community actually calls it; it falls somewhere under image processing. Anyway, that subject, whatever you call it, is a biggie for me. In reference to the 2.2.1 libraries: blob detection, I think it's called, or rendering a 3-dimensional virtual scene from the lighting values in four-component pixel-array data (I think there's an alpha value alongside each pixel's (r, g, b) values, no? Eh, Oxford dictionaries are so damn expensive).
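    On the four-component question: in Processing (and Java's image classes generally) each pixel is a single 32-bit int packed as 0xAARRGGBB, so yes, an alpha value rides along with the red, green and blue components. A minimal plain-Java sketch of unpacking one with bit shifts (the hex constant is just a made-up sample pixel):

    ```java
    public class PixelUnpack {
        public static void main(String[] args) {
            // Sample pixel: alpha 0x80, red 0xFF, green 0x40, blue 0x20.
            int argb = 0x80FF4020;
            int a = (argb >>> 24);        // top byte: alpha (unsigned shift)
            int r = (argb >> 16) & 0xFF;  // next byte: red
            int g = (argb >> 8)  & 0xFF;  // next byte: green
            int b =  argb        & 0xFF;  // bottom byte: blue
            System.out.println(a + " " + r + " " + g + " " + b); // prints: 128 255 64 32
        }
    }
    ```

    Blob detection libraries work over exactly this kind of unpacked channel data, typically thresholding brightness per pixel and then grouping connected regions.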

    I have taken to using Notepad++ with Processing; I will take a screen capture of both of my desktops in case someone spots something that's going to slow me down over the next 12 months.

    I am also (as many of you guys/girls had achieved by age 10 :-P) working on becoming competent in custom computer prototyping (I'm probably that hobo who goes through your garbage when it's roadside white-goods and electrical-waste collection week). So I'm hoping that as I learn Java it will lead me to a better understanding of how a CPU operates at the machine-code level. As I understand it, Java compiles to bytecode that runs on a virtual machine (the javap -c tool will actually show you the bytecode of a compiled class), so somewhere in between all of that I'm hoping at least a coincidental lesson about machine code will pop up.

  • thanks for the link also tlecoz!

  • edited May 2015

    Regarding the software: the first Processing lib I plan to make will be an open-source hardware-prototyping aid that plays a similar guessing game for your circuit design to the one Notepad++ plays for your code, if I'm making sense? (BTW, Notepad++ does that for a pretty amazing range of file extensions, hence why it opens up alongside the main tools I am partial to: 2.2.1, Arduino, cmd.exe.) On that note, I was just remembering that .bat file creation was actually one of the very first things I learnt as a kid, at say 13 I think, in the 12 months before I was banished from, ah, computer access for whatever, spam spam spam; I remember being as fascinated by it then as I am now. Yep, that lib will be the first thing I output of any value to anyone, lol.

  • edited May 2015

    Oh, and re the link: yep, my ego has already been devastated by the existence of Daniel Shiffman. I haven't started reading The Nature of Code yet, but I do know he is responsible for, at a rough estimate, at least 65 percent of the libraries and functions I've studied from within the Processing 2.2.1 IDE. So yes, the proof is in the sheer magnitude of the user functionality his code enables in the final application you export. In terms of success, you can only measure education quality by the rate of the learning curve over time, and the fact that Processing is such a good starting point, thanks in no small part to the way Daniel Shiffman explains things in his code commentary, is why I'm predicting it to become big in that regard. I mean, I'm not a fast learner by any means, so I know there has to be some kind of effectiveness there that's due to the design of the IDE and the intent of the main group of writers when they conceptually mapped out its structure.

  • So, no .pde attachments allowed for posts on this forum?
