But you cannot learn to code unless you build on top of things you already know: if you want to use a resolver, you first need to learn what a service, an injection, and a class are. Erica: In terms of differences, I feel the grammar of Mandarin only gets harder the more advanced you get, whereas in programming languages I personally feel the syntax becomes easier as you grow more experienced, because you can quickly notice common faults.
The other big difference, I feel, is in the thought process, though that may be a cultural thing: Americans typically use speech as a way to think through processes and ideas. Filiberto: Grammar, rules, and structures vastly outweigh expressive representation and the conveyance of meaning.
Ben: Grammar, definitely. If you have basic grammar you can still do almost everything with little vocabulary. Erica: Programming languages, in my opinion, are more about grammar. The vocabulary is rather limited. Filiberto: No idea, but I would hypothesize there is a degree of logical interference as there would be for any language.
But the foundations of all computer languages are essentially unchanged over time. Computers' cognitive abilities are limited to 0 and 1 and the infinite possibilities in between. Languages and culture have a much vaster degree of ambiguity. Erica: I think native Mandarin speakers definitely program differently.

Two MIT-based startups, Lightmatter and Lightelligence, are developing optical neural-network accelerators based on this approach. Lightmatter has already built a prototype that uses an optical chip it has fabricated.
And the company expects to begin selling an optical accelerator board that uses that chip later this year. Another startup using optics for computing is Optalysis, which hopes to revive a rather old concept. One of the first uses of optical computing, in the field's early days, was the processing of synthetic-aperture radar data. A key part of the challenge was to apply to the measured data a mathematical operation called the Fourier transform.
Digital computers of the time struggled with such things. Even now, applying the Fourier transform to large amounts of data can be computationally intensive. But a Fourier transform can be carried out optically with nothing more complicated than a lens, which for some years was how engineers processed synthetic-aperture data. Optalysis hopes to bring this approach up to date and apply it more widely. There is also a company called Luminous , spun out of Princeton University , which is working to create spiking neural networks based on something it calls a laser neuron.
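To make the lens point concrete: the transform a lens performs for free is, up to scale and phase factors, a two-dimensional Fourier transform of the field in its front focal plane. A digital sketch of the same operation (NumPy, with an arbitrary square aperture as the input; the sizes are illustrative, not from the article) shows what the electronics would otherwise have to compute:

```python
import numpy as np

# A converging lens optically produces (up to scale and phase factors)
# the 2-D Fourier transform of its input field. Digitally, the same
# operation is an FFT: roughly O(n^2 log n) for an n x n image, which
# adds up quickly for large radar datasets.
n = 256
image = np.zeros((n, n))
image[96:160, 96:160] = 1.0          # a simple square aperture

spectrum = np.fft.fftshift(np.fft.fft2(image))
power = np.abs(spectrum) ** 2        # intensity, what a detector records
print(power.shape)                   # (256, 256)
```

Because the input image is non-negative, the brightest point of the shifted power spectrum sits at the center (the DC component), exactly as the bright central spot does in the optical focal plane.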
Spiking neural networks more closely mimic how biological neural networks work and, like our own brains, are able to compute using very little energy. Luminous's hardware is still in the early phase of development, but the promise of combining two energy-saving approaches—spiking and optics—is quite exciting. There are, of course, still many technical challenges to be overcome. One is to improve the accuracy and dynamic range of the analog optical calculations, which are nowhere near as good as what can be achieved with digital electronics.
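The article doesn't specify how Luminous's laser neuron works, but the textbook software model of a spiking unit is the leaky integrate-and-fire (LIF) neuron. The sketch below, with arbitrary leak and threshold values, shows the event-driven behavior that makes spiking hardware frugal: the neuron does essentially nothing until its accumulated potential crosses a threshold.

```python
# Leaky integrate-and-fire (LIF) neuron: the standard spiking model.
# The membrane potential leaks toward rest each step, integrates the
# input current, and emits a spike (then resets) only when it crosses
# the threshold -- activity, and hence energy, is sparse.
def lif_run(inputs, leak=0.9, threshold=1.0):
    v, spikes = 0.0, []
    for current in inputs:
        v = leak * v + current       # leaky integration
        if v >= threshold:
            spikes.append(1)
            v = 0.0                  # reset after spiking
        else:
            spikes.append(0)
    return spikes

print(lif_run([0.4] * 10))  # [0, 0, 1, 0, 0, 1, 0, 0, 1, 0]
```

A constant sub-threshold input produces a regular spike train whose rate encodes the input's strength; with no input, the neuron is silent and consumes nothing.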
Those shortfalls exist because these optical processors suffer from various sources of noise and because the digital-to-analog and analog-to-digital converters used to get the data in and out are of limited accuracy.
Indeed, it's difficult to imagine an optical neural network operating with more than 8 to 10 bits of precision. While 8-bit electronic deep-learning hardware exists (the Google TPU is a good example), this industry demands higher precision, especially for neural-network training. There is also the difficulty of integrating optical components onto a chip.
Because those components are tens of micrometers in size, they can't be packed nearly as tightly as transistors, so the required chip area adds up quickly. A demonstration of this approach by MIT researchers involved a chip that was 1.
Even the biggest chips are no larger than several square centimeters, which places limits on the sizes of matrices that can be processed in parallel this way. There are many additional questions on the computer-architecture side that photonics researchers tend to sweep under the rug.
What's clear, though, is that, at least theoretically, photonics has the potential to accelerate deep learning by several orders of magnitude. Based on the technology that's currently available for the various components (optical modulators, detectors, amplifiers, analog-to-digital converters), it's reasonable to think that the energy efficiency of neural-network calculations could be made 1,000 times better than today's electronic processors.
Making more aggressive assumptions about emerging optical technology, that factor might be as large as a million. And because electronic processors are power-limited, these improvements in energy efficiency will likely translate into corresponding improvements in speed.
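Those factors can be sanity-checked with back-of-envelope arithmetic. The per-MAC energies and model size below are assumed round numbers for illustration, not measured figures for any device:

```python
# Back-of-envelope only: all three constants are assumed round numbers,
# not measurements of any particular chip or model.
E_DIGITAL_MAC = 1e-12      # ~1 pJ per multiply-accumulate (assumed)
E_OPTICAL_MAC = 1e-15      # ~1 fJ per multiply-accumulate (assumed)
MACS_PER_INFERENCE = 4e9   # order of a large vision model (assumed)

speedup = E_DIGITAL_MAC / E_OPTICAL_MAC
print(round(speedup))                       # the ~1,000x efficiency factor
print(E_DIGITAL_MAC * MACS_PER_INFERENCE)   # joules per inference, digital
print(E_OPTICAL_MAC * MACS_PER_INFERENCE)   # joules per inference, optical
```

Shaving three more orders of magnitude off the optical energy, as more aggressive technology assumptions would, is what turns the factor of a thousand into a factor of a million.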
Many of the concepts in analog optical computing are decades old. Some even predate silicon computers. Schemes for optical matrix multiplication, and even for optical neural networks, were demonstrated in that early era. But the approach didn't catch on.
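The workload behind both those early demonstrations and today's accelerators is ordinary matrix-vector multiplication, the core of every dense neural-network layer. A minimal digital sketch (NumPy, with arbitrary layer sizes chosen for illustration) of the operation an optical chip would carry out in the analog domain:

```python
import numpy as np

# One dense neural-network layer: y = activation(W @ x + b).
# An optical accelerator would perform the W @ x part in the analog
# optical domain; here everything is ordinary digital NumPy.
rng = np.random.default_rng(0)
W = rng.normal(size=(4, 8))      # weight matrix: 4 outputs, 8 inputs
b = np.zeros(4)                  # bias vector
x = rng.normal(size=8)           # input activations

y = np.maximum(W @ x + b, 0.0)   # ReLU activation
print(y.shape)                   # (4,)
```

Because inference is dominated by exactly these multiply-accumulate operations, any medium that performs them faster or more cheaply than transistors translates directly into end-to-end gains.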
Will this time be different? Possibly, for three reasons. First, deep learning is genuinely useful now, not just an academic curiosity. Second, we can't rely on Moore's Law alone to continue improving electronics.
And finally, we have a new technology that was not available to earlier generations: integrated photonics. These factors suggest that optical neural networks will arrive for real this time, and the future of such computations may indeed be photonic.
IEEE Spectrum.
Programs written in the language include one for fortune-telling from the I Ching. A rendering of a program written in wenyan-lang to draw the Mandelbrot set.

But a good question. In mainland China, at least, virtually everyone of an age to be getting a programming job already knows Pinyin--a version of their language set into our alphabet.
Also, you have to know it to type anything into the computer, as that's how the input editor works. Thus it seems to me a trivial step to use the standard keywords--why in the world would they make a different version of the language in order to get around this? The inconvenience of their code not being interchangeable would far exceed the tiny advantage of not having to deal with reserved words that aren't in their script.
The Chinese and other cultures I have come into contact with all code in English. I live in Vietnam and they still use English here, even for all the variable names. On occasion you will see Vietnamese comments, but not for the code itself. This makes sense when you realize that the majority of online resources are in English, and sharing code with people of other cultures becomes easier too.
I've got two native Chinese speakers on my team - I'll ask them personally on Tuesday if you don't get a good answer by then, but here's my initial guess.
Code is written in the standard programming languages, although comments and perhaps variable names could be written in Chinese. Most of the commonly used languages are English-based, and most of them are so to appeal to an international audience. Here is a list of some non-English-based ones: Wikipedia: Non-English-based programming languages. I'd be surprised if any is used on the same scale as other widely used languages at the moment.
Most Chinese people just code in the same way as the rest of the world, using those common programming languages, at most with some Chinese comments or romanized Chinese variable names. Now it is predominantly used by those with little programming foundation to develop things like game cheats and malware.
How are the Chinese coding?

Wei Sheng is TechNode's feature editor. You can contact him at shengwei [at] technode [dot] com.