Why so little innovation in programming languages, Hemisemipumo?
Date: June 14th, 2012 5:53 AM Author: aromatic national
Aside from Objective-C for iPhones, nothing. The same languages used to write Minesweeper for Windows 95 are used to build the massive multithreaded backend infrastructure at Google today.
http://www.tiobe.com/content/paperinfo/tpci/images/tpci_trends.png
(http://www.autoadmit.com/thread.php?thread_id=1970439&forum_id=2#20885469) |
Date: June 15th, 2012 1:05 AM Author: Cerebral Kitty Cat
1) what do you even mean by this? It's a vague and kind of nonsensical question
2) your premise isn't even true if I make some assumptions about what you're saying. There have been enormous advances in programming languages. Ever heard of Java and the JVM? That was a HUGE deal because it allowed developers to write non-platform-specific code (see the sketch after this list).
3) innovations don't come at a superficial "code" level, but rather in compilation or in VM efficiencies, if that is how the code is executed. Redesigning code syntax or using a new language because "it's new" would be like redesigning a hammer. C has been used for 30 years because it still works really well in a lot of situations.
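To make the JVM point concrete, a minimal sketch (the class name and message are just for the example): the source compiles once to bytecode, and the identical .class file runs on any OS with a JVM.

    // Hello.java -- compiled once to JVM bytecode; the same Hello.class
    // then runs unmodified on Windows, Linux, macOS, Solaris, etc.
    public class Hello {
        public static void main(String[] args) {
            System.out.println("same bytecode, running on: "
                    + System.getProperty("os.name"));
        }
    }

Compile with javac Hello.java on one machine, copy Hello.class to a box running a different OS, and java Hello executes it unchanged.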
(http://www.autoadmit.com/thread.php?thread_id=1970439&forum_id=2#20890978) |
Date: June 15th, 2012 1:22 AM Author: Carnelian faggotry
the setup of today's computer architecture requires coders to issue specific commands or build a specific framework for each task that their program carries out. this is insanely tedious.
it's possible to imagine a much more "genetic" architecture which would allow for "self-developing" code which can carry out tasks by itself.
for example, think about database programming. you have to define EVERY GODDAMN THING about the nature of the storage and retrieval of your data.
but imagine a system of computer coding and architecture which would allow you to basically say, "computer, i am going to hurl data at you, and it's up to you to sort and store it in the most efficient manner you can find. change things up if need be."
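something close to that already half-exists in schemaless document stores like MongoDB, where you just hurl JSON at the database and it sorts out the storage. here's a toy sketch of the idea in Java (all names are made up for illustration; this is not any real product's API):

    import java.util.*;

    // Hypothetical sketch of a "hurl data at me" store: callers never declare
    // a schema; the store indexes whatever fields show up, as they show up.
    public class SelfOrganizingStore {
        private final List<Map<String, Object>> records = new ArrayList<>();
        // field name -> (value -> positions of matching records), built lazily
        private final Map<String, Map<Object, List<Integer>>> indexes = new HashMap<>();

        public void hurl(Map<String, Object> record) {
            int id = records.size();
            records.add(record);
            record.forEach((field, value) ->
                indexes.computeIfAbsent(field, f -> new HashMap<>())
                       .computeIfAbsent(value, v -> new ArrayList<>())
                       .add(id));
        }

        public List<Map<String, Object>> find(String field, Object value) {
            List<Map<String, Object>> out = new ArrayList<>();
            for (int id : indexes.getOrDefault(field, Map.of())
                                 .getOrDefault(value, List.of())) {
                out.add(records.get(id));
            }
            return out;
        }
    }

the "change things up if need be" part (the store re-optimizing its own layout over time) is the genuinely unsolved piece.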
(http://www.autoadmit.com/thread.php?thread_id=1970439&forum_id=2#20891026) |
Date: June 15th, 2012 1:30 AM Author: Carnelian faggotry
it goes well beyond that. i'm talking about something more like the computer being able to use and even define its own heuristics in order to achieve specific outcomes.
for example, think about malware. there is currently no way to give the computer a set of general guidelines about things that should never be able to happen to it (events that would indicate a malware attack) and have it automatically reject them.
there's no way to code a computer so that - even though you never "mention" (via code) something like a buffer overflow attack - the machine would "recognize" at a general level that certain events were improper all on its own, and would not allow them to proceed.
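for what it's worth, managed runtimes already do a narrow version of this: the JVM bounds-checks every array access, so the overflow below is refused at runtime even though nothing in the code ever mentions overflows. a minimal demo:

    // The JVM rejects the out-of-bounds write at runtime, with no
    // overflow-specific logic anywhere in the program itself.
    public class BoundsDemo {
        public static void main(String[] args) {
            byte[] buffer = new byte[16];
            int i = 0;
            try {
                while (true) {
                    buffer[i] = 0x41;   // the 17th write is refused by the VM
                    i++;
                }
            } catch (ArrayIndexOutOfBoundsException e) {
                System.out.println("overflow stopped at index " + i);
            }
        }
    }

generalizing that beyond memory safety to arbitrary "improper events" is exactly the unsolved part.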
(http://www.autoadmit.com/thread.php?thread_id=1970439&forum_id=2#20891052) |
Date: June 15th, 2012 4:46 AM Author: Cerebral Kitty Cat
do you mean the halting problem?
That is certainly a major, major issue in computing, with a profound effect on the way software is designed. I can see how resolving it would allow for major optimizations in hardware. Unfortunately, it has been proven undecidable, so it is just a mathematical fact that any machine designer has to work with.
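For anyone following, the undecidability argument fits in a few lines. A sketch assuming a hypothetical oracle halts(p, in), made up for this sketch (the whole point is that no such method can exist):

    public class Halting {
        // Hypothetical oracle: returns true iff program p halts on input in.
        // Assume, for contradiction, that someone has implemented it.
        static boolean halts(String p, String in) { return true; /* "magic" */ }

        // Do the opposite of whatever the oracle predicts about us.
        static void diagonal(String p) {
            if (halts(p, p)) {
                while (true) { }   // loop forever if the oracle says we halt
            }
            // otherwise halt immediately
        }

        public static void main(String[] args) {
            // Feed diagonal its own source: halts("diagonal", "diagonal") is
            // wrong whichever value it returns, so no correct halts() exists.
            diagonal("diagonal");
        }
    }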
I think the real issue is balancing hardware implementation and ease of software implementation (this goes to my point about why multithreading isn't a sustainable solution). Great pains have been taken, via multiple layers, to abstract the user's experience away from the lower-level machine implementation.
When we have a multicore CPU, only so much optimization can occur through automated means (e.g., the kernel can automatically schedule threads across cores). More importantly, these automated performance gains are logarithmic. The only real way to truly take advantage of 16 cores is through innovation on the programmer's part, and humans are just bad at that (sketch below).
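Concretely, that "innovation on the programmer's part" means decomposing the work by hand. A minimal sketch (the summation task is arbitrary, just something embarrassingly parallel):

    import java.util.*;
    import java.util.concurrent.*;

    // To use all the cores, the programmer must split the work explicitly;
    // the kernel's scheduler cannot invent this decomposition on its own.
    public class SixteenCores {
        public static void main(String[] args) throws Exception {
            int cores = Runtime.getRuntime().availableProcessors();
            ExecutorService pool = Executors.newFixedThreadPool(cores);
            long n = 1_000_000_000L;
            long chunk = n / cores;
            List<Future<Long>> parts = new ArrayList<>();
            for (int c = 0; c < cores; c++) {
                long lo = c * chunk;
                long hi = (c == cores - 1) ? n : lo + chunk;
                parts.add(pool.submit(() -> {   // one chunk per core
                    long s = 0;
                    for (long k = lo; k < hi; k++) s += k;
                    return s;
                }));
            }
            long total = 0;
            for (Future<Long> f : parts) total += f.get();
            pool.shutdown();
            System.out.println("sum = " + total + " using " + cores + " cores");
        }
    }

Nothing in the kernel or the VM could have discovered that chunking automatically; the decomposition is the human's job.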
So you can see, a major obstacle is creating a computing paradigm that accommodates the limitations of its designers.
(http://www.autoadmit.com/thread.php?thread_id=1970439&forum_id=2#20891346) |
Date: June 15th, 2012 1:32 AM Author: violet goyim base
(non-CS-degree holders pretending to be smart)
lol
(http://www.autoadmit.com/thread.php?thread_id=1970439&forum_id=2#20891059) |
Date: June 15th, 2012 2:28 PM Author: shimmering motley menage
you realize a language is just a collection of building blocks, right?
if a language is Turing complete, you can write any program in it that you can write in any other Turing-complete language. so they are equally "powerful" from a mathematical POV (in fact, languages in the same class can be considered "equal" outright, because you can map one onto the other).
"innovations" in language are usually just syntactic sugar, and not at all necessary. example below.
(http://www.autoadmit.com/thread.php?thread_id=1970439&forum_id=2#20893174) |
Date: June 15th, 2012 2:38 PM Author: shimmering motley menage
because the underlying mathematical principles don't change, as they are axiomatic. computer science is a mathematical science, not a physical science.
the very basic building blocks will always be the same: machine code, i.e. 0s and 1s. we have abstracted two layers above that now. the first layer is assembly, and the second is the current collection of "higher-level programming languages".
it's possible that we will add more layers (see functional programming as a more advanced paradigm), but that's all we are doing. the sketch below shows the same trivial method at those layers.
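to make the layering concrete (the bytecode listing is representative javap -c output, not guaranteed verbatim):

    public class Layers {
        // layer 2: high-level source
        static int add(int a, int b) { return a + b; }

        public static void main(String[] args) {
            System.out.println(add(2, 3));
        }
    }

    // layer 1: disassembling with "javap -c Layers" shows add() as roughly:
    //   iload_0    // push a onto the operand stack
    //   iload_1    // push b
    //   iadd       // pop both, push a + b
    //   ireturn    // return the top of the stack
    // layer 0: the JIT compiles those opcodes down to native 0s and 1s.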
(http://www.autoadmit.com/thread.php?thread_id=1970439&forum_id=2#20893216) |