Little information about repairing electronic cameras


r_a_feldman

Member
Joined
Oct 22, 2009
Messages
159
Location
Chicago, IL
Format
Multi Format
Can floppy disk drives still be managed by current versions of Windows? Or did the manufacturers use other operating systems?

Maybe. Whether your computer can use floppies depends not on Windows but on the computer's BIOS. I have a 15-year-old Dell Precision whose BIOS can handle 1.44MB 3.5-inch and 1.2MB 5.25-inch floppies, but not 360KB 5.25-inch (original IBM format) floppies. The computer runs Windows 10, and I also run Windows XP and MS-DOS 6 under VirtualBox, but the BIOS is the same, so no luck with 360KB.

The version of Windows might matter if the software you are trying to run is 16-bit or 32-bit and the OS is 64-bit; that will create problems. In that case, run a VM with the old OS.

Edit: If the software is on 1.44MB 3.5-inch floppies, you can get a USB floppy drive, which gets around any BIOS issues. You still have to deal with 64-bit and other OS issues, though.
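The capacity differences between those formats come straight from the disk geometry, which is exactly what the BIOS has to know about. As a quick sketch of the arithmetic, using the standard IBM PC geometries:

```python
# Raw capacity of common PC floppy formats, computed from disk
# geometry: cylinders x heads x sectors-per-track x 512 bytes.
# These are the standard IBM PC geometries for each format.

def floppy_bytes(cylinders, heads, sectors_per_track, bytes_per_sector=512):
    """Return the raw capacity of a floppy given its geometry."""
    return cylinders * heads * sectors_per_track * bytes_per_sector

formats = {
    '3.5" 1.44MB': floppy_bytes(80, 2, 18),  # 1,474,560 bytes
    '5.25" 1.2MB': floppy_bytes(80, 2, 15),  # 1,228,800 bytes
    '5.25" 360KB': floppy_bytes(40, 2, 9),   #   368,640 bytes
}

for name, size in formats.items():
    print(f"{name}: {size} bytes")
```

The 360KB format is the odd one out partly because it is a 40-track disk, which is one reason newer BIOSes and USB floppy drives often don't support it.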
 
Last edited:

Neofito

Member
Joined
Jul 31, 2023
Messages
56
Location
Berlin
Format
35mm
I think this is the key to not giving up when you have problems with the electronics of the Minolta X cameras. At the moment, replacement ICs can only be unsoldered from abandoned Minoltas, which is not easy given their size and the neighboring components. And then there is still the soldering back in.

I am not an electronics engineer, but in theory a microcontroller could be read out and its contents dumped to a file that could later be written to flashable microcontrollers. I am sure this kind of reverse engineering doesn't need crazy tools for microcontrollers from the 80s like the ones in the Canon AE-1.
I think that until now there were plenty of cheap cameras on the market, not so many faulty microcontrollers, and no technicians with enough knowledge. But as time goes on, the first electronic cameras will become more precious, and sooner or later people with the required knowledge will enter the repair community.

I think it's very likely that in 10 years we will be flashing microcontrollers on Minoltas, Canon AE-1s and so on. Probably some crazy YouTuber will show the whole process too, alongside the ones we know today who so nicely show us how to fix a Leica III shutter.
 

koraks

Moderator
Moderator
Joined
Nov 29, 2018
Messages
20,581
Location
Europe
Format
Multi Format
in theory a microcontroller could be read out and its contents dumped to a file that could later be written to flashable microcontrollers.

It seems that easy, doesn't it?

Maybe, one day. But currently, microcontroller technology doesn't work in this "record & play back" kind of way. Yes, you could train a neural network to produce the same outputs on the same inputs as some chip in an AE1 - theoretically. But theoretically, a lot is possible that in practice is kind of challenging to do.

I guess you could get somewhere if you:
* Pick a simple camera model; i.e. the first generation that had any chips in them at all.
* Pick a model for which you have the complete manufacturer service manual so the entire system is specified in detail, including signal conditions and requirements.
* Have a solid basis in embedded systems design.
* Have a couple hundred to a few thousand hours to spare.

Given a population of 7+ billion, you might say that the above will eventually materialize, and that's also the only reason why I don't dismiss the idea out of hand.
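As an aside, the "record & play back" idea is only even well-defined for a purely combinational chip, where a brute-force capture of its truth table would work in principle. A toy sketch, with `read_outputs` as a hypothetical stand-in for real test hardware:

```python
# Illustrative sketch of the "record & play back" idea for a purely
# combinational chip: drive every input combination, record the
# outputs, and the resulting truth table fully specifies a replacement.
# `read_outputs` is a hypothetical stand-in for real test hardware.

from itertools import product

def capture_truth_table(read_outputs, n_inputs):
    """Exhaustively record outputs for every input combination."""
    table = {}
    for bits in product((0, 1), repeat=n_inputs):
        table[bits] = read_outputs(bits)
    return table

# Example: pretend the chip under test is a 2-input NAND gate.
nand = lambda bits: 0 if all(bits) else 1
table = capture_truth_table(nand, 2)
print(table)  # {(0, 0): 1, (0, 1): 1, (1, 0): 1, (1, 1): 0}
```

The reason this doesn't scale to a real camera IC is state: its outputs depend on history, not just on the current inputs, so there is no finite table to capture, which is exactly the difficulty described above.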

This is not to brag, but to clarify why I'm a bit skeptical. I have a decent amateur working knowledge of microcontrollers and have done many projects involving the full process from requirements identification, electrical engineering, manufacturing, embedded software engineering etc. One thing that's always exceedingly difficult is trying to 'hack into' existing systems, although I've done so on occasion (i.e. reverse engineering a very simple Chinese remote controlled relay array because I really liked how the remote handled). The general consensus if you ask around among electronics enthusiasts is to just gut the device, pick out the components you want to re-use and start over. The equivalent in camera terms would be to leave critical components like light meter, solenoids/magnets, LCD's and buttons in place and then start building your own control hardware to tie it all together. The complexity of such a project far exceeds the realistic value of it; seriously, just chuck the damn thing in the garbage and wait for Pentax to launch their new models!

One of the easiest projects I did (not really a 'project') was the addition of a small microcontroller to a Sigma lens with a Canon mount that refused to work on modern Canon EOS cameras: https://tinker.koraks.nl/photography/potato-potato-making-an-old-sigma-lens-work-on-every-eos-body/ This 'project' ended up consisting of not much more than recreating the work of someone else, but in doing so, I also retraced their steps in reverse engineering the lens mount protocol and analyzing the software they ended up writing. Because the work was already done, this cost me just a few hours of desk research. Because the Canon EOS system uses a bog standard SPI interface, this saved the original inventor countless hours of reverse engineering the lens/camera communication. Because all that's needed in this hack is to literally flip one single bit in one particular bit sequence, it's something that can realistically be done.
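For illustration, the kind of one-bit patch described above can be sketched in a few lines. The pattern bytes and bit position here are invented for the example; they are not the real Canon EOS lens-protocol values.

```python
# Toy illustration of the kind of fix described above: watch a byte
# stream for a particular sequence and flip a single bit in the byte
# that follows it. The pattern and bit position are made up for
# illustration; they are NOT the real Canon EOS protocol values.

PATTERN = bytes([0xB8, 0x15])   # hypothetical byte sequence to match
BIT_TO_FLIP = 0x01              # hypothetical bit within the next byte

def patch_stream(data: bytes) -> bytes:
    """Flip one bit in the byte following each occurrence of PATTERN."""
    out = bytearray(data)
    i = 0
    while (i := data.find(PATTERN, i)) != -1:
        j = i + len(PATTERN)
        if j < len(out):
            out[j] ^= BIT_TO_FLIP
        i += 1
    return bytes(out)

print(patch_stream(bytes([0xB8, 0x15, 0x00])).hex())  # b81501
```

In the real hack the patching happens live on the SPI line, of course, not on a buffered stream, but the logic is this simple.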

What you're proposing is orders of magnitude more complex than the tiny example above. Scale up the flipping of a single bit on a single communication line on a single type of camera to basically substituting the entire controls of a complex camera system... It's not impossible. It's just a little more difficult than I suspect you imagine it to be.
 
OP

Andreas Thaler

Subscriber
Joined
Nov 19, 2017
Messages
4,300
Location
Vienna/Austria
Format
35mm
It's a bodged job compared to the immaculate work you're doing on cameras! What I did there is quite ugly - but it works.

Don't forget that I am a trained PR professional. So who knows what I'm really doing at the kitchen table at home! Maybe it's all just an illusion! 🥳🙃

No, seriously, I wish I had some of that knowledge of digital electronics as you do.

This digital world requires absolutely disciplined thinking; there is no leeway as with analog electronics, where things usually don't have to be 100% exact.

How do you see the introduction of ICs in camera electronics? It made fascinating new functions possible, but could no one understand how the circuits worked anymore?
 
OP

Andreas Thaler

Subscriber
Joined
Nov 19, 2017
Messages
4,300
Location
Vienna/Austria
Format
35mm
Maybe. Whether your computer can use floppies depends not on Windows, but on the computer BIOS. I have a 15-year old Dell Precision where the BIOS can handle 1.44MB 3.5 inch and 1.2MB 5.25 inch floppies, but not 360KB 5.25 inch (original IBM format) floppies. The computer has Windows 10, but I also run Windows XP and MS-DOS 6 under VirtualBox, but the BIOS is the same, so no luck with 360KB.


Thank you, that's not all that easy in the digital world 😌
 

koraks

Moderator
Joined
Nov 29, 2018
Messages
20,581
Location
Europe
Format
Multi Format
How do you see the introduction of ICs in camera electronics? It made fascinating new functions possible, but could no one understand how the circuits worked anymore?

Well, that's an interesting remark and I think it was a dramatic event for the camera industry and the people working in it.

To give a parallel from office automation, a field in which my father was active starting in the 1970s. A tiny bit of background: he worked for banks, publishers and supermarket chains, and in such organizations introduced the first generation of computers (i.e. mainframes) for management reporting and business control. Some of these organizations had already made steps toward automation using Hollerith machines; others relied on manual calculations and administrative systems.

Either way, when the time came for these organizations to transition to actual digital systems, it proved difficult and sometimes impossible for existing staff to make the move to the new technology. The intuitive, human-readable nature of manual administration systems, and to an extent Hollerith cards, had no parallel in mainframe automation, with its inherent characteristics such as digital encoding and the distinction between data and code. Especially the latter proved difficult: many of the older staff simply could not grasp how the same machine could be programmed to do different tasks without altering its hardware.

I imagine that a similar divide emerged in the camera industry in the 1970s, with the old guard invested in mechanical solutions while a newer generation started to solve the same engineering problems with electronics, ICs and rudimentary microcontrollers. Surely there would have been staff who didn't quite understand the new way of doing things and were probably distrustful of it as a result. Since a camera still relies on a couple of inherently mechanical operations, there would have remained opportunities for at least some of these people to carry on, as long as they could interface well enough with the rest of the system (both the camera and the organization developing it!). So I expect it was a landmark event, with massive implications for organizational processes, the workforce, and all the indirect effects on the company and its environment.

The notion that no one would understand it anymore is, I think, not entirely correct, but I do think part of the old guard would have felt that way. I also think this probably gave rise to a new generation of managers and systems engineers with a different educational background (e.g. electrical engineers instead of mechanical ones) in key positions in the firm. They would have brought different management paradigms to such R&D projects, for sure. While thinking in modular terms is already recognizable in mechanical equipment, fully mechanical cameras remain very tightly integrated, functionally and spatially. With electronically controlled cameras you see a clear development in the direction of modularization, although it's constrained (or hidden) a bit by spatial limitations. But I do expect this was one of the main enablers of the explosion of functionality in cameras of the late 1980s and 1990s. An electrical design makes it a little easier to draw a boundary around a subsystem and define interfaces for it that can then be more easily realized in a physical implementation. A parallel in the semiconductor industry is the concept of VLSI (Very Large Scale Integration), which relies on a similar logic of 'divide and conquer'.

It's easy to underestimate how fundamental such a transition is, and how remarkable it is when a company survives it. At the same time, it also creates new opportunities. An example of this is the rise of a company like Sony in the camera business. Especially with actual image recording becoming a digital affair, Sony quickly realized that the difference between a minidisc player and a camera is not all that big, conceptually speaking. And sure enough, they built a successful business out of it.

Well, there's a lot that can be said about it. The above is really a very quick & very dirty response and it would take a couple of books to work it out in any reasonable depth.
 
OP

Andreas Thaler

Subscriber
Joined
Nov 19, 2017
Messages
4,300
Location
Vienna/Austria
Format
35mm
Thank you! @koraks

Just one more question:

How are ICs - let's stick to our topic of camera electronics - developed?

Who creates the circuit and who oversees in detail the cooperation of countless transistors, diodes and resistors that are invisibly applied to the chip?

How is that all organized?
 

koraks

Moderator
Joined
Nov 29, 2018
Messages
20,581
Location
Europe
Format
Multi Format
Ah, I'd have to write a book to answer that, and I wouldn't be the best person to do it, either.

It depends on the kind of IC. There are massive differences between memory, processing, logic, analog/mixed-signal and other ICs. The differences are mostly in the degree of repetitiveness of the design and the engineering competencies required, which can extend from digital to analog to RF and beyond. Another major factor is the degree of complexity.

Something as simple as an opamp might be manageable (in theory) by a single person in terms of design, but the kind of IC you'd find in a late 1980s camera system is already complex enough that you're probably looking at teams of dozens of people.

Then there's the issue of which design stage you want to look at. This stretches from conceptual and architectural design (basically, how are we going to plot the requirements onto which building blocks, and how would that more or less map onto the space, power and cost budgets we have) to high-level building block definition, to detailed electrical engineering, and then the translation from theoretical discrete parts into actual physical implementation (i.e. what kind of semiconductor technology will we use to create these transistors, what kind of dimensions do they need to have etc.), and ultimately the process of bringing the on-paper design through prototyping stages to a manufacturing reality.

From this, you can glean that it's really a team effort and there's no clear-cut answer to the question of how it's done. It really depends on the nature of the technology, but also very much on the organizational and technical context the work occurs in. People at NXP will probably solve the problems in a different way than at e.g. TI, for a variety of reasons. These reasons include to what extent actual semiconductor manufacturing is done in-house, whether the company specializes in IC design or does it as part of a much broader set of tasks, etc. The latter is mostly a moot point for industry at large, because there's now a high degree of specialization, which means OEMs tend to purchase whatever semiconductors they need off the shelf. However, certain areas, camera systems indeed among them, remain a bit special in their degree of specialization, and this is accompanied by a large degree of vertical integration (especially in Japan) or at least by very tight, long-term collaborations that resemble vertical integration (the rest of the world, basically).

Again, it takes volumes to answer that.
 
OP

Andreas Thaler

Subscriber
Joined
Nov 19, 2017
Messages
4,300
Location
Vienna/Austria
Format
35mm
Thank you again! @koraks

Then I don't have to feel like a failure if I can't explain how the X-700's circuitry works ☺️

For me, this is a fascinating topic and motivation to expand my digital knowledge.

I am a fan of older specialist literature, as the explanations there are often very well-founded and understandable. Accordingly, I stocked up on reading material, mainly on the subject of electronics, over the winter 🤠

1.jpg


My wife doesn't say anything more about it.

Recently I was able to purchase a large number of copies of the SPT Journal on eBay and a few copies of The Camera Craftsman and The New Camera Craftsman that the owner of a camera repair shop was selling.

I'm very happy because the SPT journals are rare and there are relatively few issues available on Learn Camera Repair.

2.jpg
 
Last edited:

koraks

Moderator
Joined
Nov 29, 2018
Messages
20,581
Location
Europe
Format
Multi Format
Then I don't have to feel like a failure if I can't explain how the X-700's circuitry works

Certainly not!! It's super complex.

I am a fan of older specialist literature

Ah, yes. Well, the thing is that while some of the core concepts discussed in the kind of books you have there (i.e. Wolf, Kleemann) are still valid, contemporary practice in microelectronics is really different from what's taught in those books. Yes, you can still purchase plenty of 74-series chips, but frankly, who uses them? They might see service at a very limited scale as a line driver here and there. I used a 74HC14 the other day because I needed to invert a couple of PWM signals, but that was the first or second time I've resorted to something like that in my own projects, and only because I happened to have them; there are plenty of other solutions to the same problem.

And virtually all of it is now done with microcontrollers, of course. Even the simplest things. I reverse engineered a heating mat the other day - just a simple pad that gets hot if you throw a hardware switch. How hard can it be? Sure enough, there's a little 8-bit microcontroller in there that drives the gate of a triac and that handles the 90-minute auto-turnoff delay and runs the indicator LED that shows active duty - oh, and it constantly monitors the temperature of the pad, too, and thus controls it through PID and protects against overheating. Back in the days of your books, this would all be solved with a handful of TTL chips. But the single 16-pin probably 8051-architecture derived controller is far cheaper and more flexible to use. And it's a lot easier to implement as well for today's engineers. Why spend hours leafing through a catalog of logic gates and schmitt triggers if you have all of that on board by default in the bog-standard $0.20 microcontroller, including an array of timers, interrupts, memory, ADC's...
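To make the comparison concrete, here is a much-simplified sketch of the kind of control loop that little 8051-class part might run: proportional-only control (a stand-in for the full PID mentioned above), plus the 90-minute auto-off and an overheat cutout. All setpoints and gains are invented for illustration; the real firmware and its tuning are unknown.

```python
# Simplified sketch of the heating-pad control loop described above:
# proportional temperature control, a 90-minute auto-off, and an
# overheat cutout. All thresholds and gains are invented; the real
# firmware is unknown.

AUTO_OFF_S = 90 * 60   # 90-minute auto-turnoff, in seconds
T_SET = 45.0           # target pad temperature, deg C (assumed)
T_MAX = 60.0           # overheat cutout, deg C (assumed)
KP = 0.1               # proportional gain (assumed)

def duty_cycle(temp_c, elapsed_s):
    """Return the triac duty cycle in [0, 1], or 0.0 when off."""
    if elapsed_s >= AUTO_OFF_S or temp_c >= T_MAX:
        return 0.0                         # timed out or overheating
    error = T_SET - temp_c
    return max(0.0, min(1.0, KP * error))  # clamped proportional term

print(duty_cycle(20.0, 60))    # cold pad, just switched on -> 1.0
print(duty_cycle(44.0, 60))    # near setpoint -> small duty cycle
print(duty_cycle(30.0, 6000))  # past 90 minutes -> 0.0
```

Back in the TTL days, each of those three behaviors would have been a separate handful of chips (timer, comparator, driver logic); here they are a few lines of firmware on one pin-cheap part, which is the whole point.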

The thing is that so much has changed that those books barely help in understanding current practice. They're also of limited help in understanding the electronics in the kind of cameras you're working on, because those had to resort to higher levels of integration at an early stage and would have explored the use of microcontrollers very early as well. Hobby-oriented literature of the day mostly stepped over those topics because they were undocumented and mostly not relevant for hobby projects anyway.

They're still nice books, but I find them mostly nice because of the nostalgia associated with them...they remind me of a world that seemed much simpler. Which is really a fallacy, but it still feels nice, hah!
 
Last edited:
OP

Andreas Thaler

Subscriber
Joined
Nov 19, 2017
Messages
4,300
Location
Vienna/Austria
Format
35mm
You're right, digital technology has moved on, and this is no longer the current state of the art.

I completed two correspondence courses, one in analog electronics and one in digital electronics.

Digital, well, I disappointed my teacher because I didn't want to do the work with the Arduino. As soon as I started programming, I was unhappy. That doesn't suit me at all.

This means I have no connection to the present. But for my area of interest, camera technology of the 70s/80s, I am hopefully still quite well equipped with the basics.

Not for the X-700 anymore, but with the AE-1 I hope to at least still understand the electronic outline.
 
Last edited:
OP

Andreas Thaler

Subscriber
Joined
Nov 19, 2017
Messages
4,300
Location
Vienna/Austria
Format
35mm
Last year I repaired battery acid damage on two fairly rare Nikon intervalometers.

The devices allow the motorized camera to be triggered automatically at adjustable time intervals and frequencies.

The MT-1 dates back to the 70s and works analog:

[Photos: IMG_4679.jpeg, IMG_4680.jpeg, IMG_4685.jpeg, IMG_4686.jpeg, IMG_4681.jpeg, IMG_4684.jpeg, IMG_4687.jpeg]

The MT-2, the successor model, appeared in the 80s and works digitally. A TTL graveyard 😛

[Photos: IMG_4678.jpeg, IMG_4677.jpeg, IMG_4676.jpeg, IMG_4682.jpeg, IMG_4683.jpeg]

With my level of knowledge, I should at least understand the basics of what's going on in there 🙃
 
Last edited:

koraks

Moderator
Joined
Nov 29, 2018
Messages
20,581
Location
Europe
Format
Multi Format
Digital, well, I disappointed my teacher because I didn't want to do the work with the Arduino. As soon as I started programming, I was unhappy. That doesn't suit me at all.

I can relate; I've got the same thing, just the other way around! The MT-2 I could find my way around. The MT-1...well, it would cost me a whole lot more time, I'd really need a schematic to go with it, and even then I'd appreciate it if someone were around to walk me through it!
 
OP

Andreas Thaler

Subscriber
Joined
Nov 19, 2017
Messages
4,300
Location
Vienna/Austria
Format
35mm
The MT-1 … even then I'd appreciate it if someone were around to walk me through it!

That can't be me 😉

But haven't you seen all the resistors there?

A fantastic machine, it works with invisible voltages. A real electronic brain!

U = R * I


Any further questions?

🤣
 
Last edited:
OP

Andreas Thaler

Subscriber
Joined
Nov 19, 2017
Messages
4,300
Location
Vienna/Austria
Format
35mm
Both were certainly expensive in their time: cutting-edge technology. And the MT-1 probably involved a lot of working time too; it looks hand-soldered.
 
OP

Andreas Thaler

Subscriber
Joined
Nov 19, 2017
Messages
4,300
Location
Vienna/Austria
Format
35mm
And both intervalometers work 🙂

IMG_4688.jpeg


Only on the MT-1 does it look like electrolyte is leaking from this electrolytic capacitor (at least it appears to be one). But if it is not a specially dimensioned component, it can be replaced, if necessary by connecting several capacitors together to reach the target capacitance.

Where did I get my book with the formulas ...

[Photos: IMG_4691.jpeg, IMG_4690.jpeg, IMG_4689.jpeg]


Wonderful toys!

By the way, the connecting cable between the intervalometer and motor drive is super rare.
 
Last edited:

koraks

Moderator
Joined
Nov 29, 2018
Messages
20,581
Location
Europe
Format
Multi Format
it look like electrolyte is leaking from this electrolytic capacitor

Yeah, that one looks like toast. If it's on a power supply line (which is often fairly easy to determine by poking around with a continuity meter), it's not critical and any decent-value cap will do.
Otherwise, parallel a couple of caps like you said. When putting them in parallel, you can simply add the values.
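The parallel rule really is just addition, so building an odd value from stock parts is easy to check. A trivial sketch, with the 470 µF target and the stock values picked purely as an example:

```python
# Capacitors wired in parallel simply add their values, so a missing
# value can be built from several smaller ones. (In series it would be
# the reciprocal sum, like resistors in parallel.)

def parallel_capacitance(*caps_uf):
    """Total capacitance of capacitors in parallel, in microfarads."""
    return sum(caps_uf)

# Example: approximating a hypothetical 470 uF electrolytic with
# common stock values.
print(parallel_capacitance(220, 220, 33))  # 473 (uF), close enough
```

Mind the voltage rating when doing this: each capacitor in the parallel bank still sees the full line voltage, so every one of them needs to be rated for it.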
 
OP

Andreas Thaler

Subscriber
Joined
Nov 19, 2017
Messages
4,300
Location
Vienna/Austria
Format
35mm
By the way, the forum software here is great. I do almost all of my posts on the iPhone 11 Pro Max and editing goes smoothly without any problems. The system is far better than anything comparable I've ever worked with 👍
 

koraks

Moderator
Joined
Nov 29, 2018
Messages
20,581
Location
Europe
Format
Multi Format
the forum software here is great

@Sean, you hear this?
I agree, btw. It may not be perfect, but it's really, really good. It's a XenForo implementation, and I think that, along with Discourse, it's probably the best there is at this point in terms of usability. Of the two, XenForo is probably better geared to a classic forum style like we have here.

There are a few issues, for instance with the usability of the moderator back-end (although I can live with those), and sadly the text editor on Android only seems to support plain-text editing, at least on my phone. But contrary to many other forums, it's really simple to drag & drop content into posts, etc.
 

Neofito

Member
Joined
Jul 31, 2023
Messages
56
Location
Berlin
Format
35mm
It seems that easy, doesn't it.


I am talking about the very first generation, like the Canon AE-1, which basically implemented a microcontroller version of the analog circuit: compare the exposure and tell you whether you are under- or overexposed, that's it. Of course, once you get to more advanced metering programs and functions, it's another matter.

If someone is already hacking an SX-70, I can easily see some other models getting hacked within 10 years too. I am not saying ALL models, or even all the oldest microcontroller-driven cameras from the 80s. Of course, I really doubt we will ever see someone replacing the brain of an EOS-1.
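The first-generation metering logic described above (compare the set exposure against the metered light and report under/over) can be sketched with the standard exposure-value formula. The tolerance and the match-needle framing are simplifications; scene EV is assumed to already account for the film's ISO.

```python
# Rough sketch of first-generation metering logic: compute the EV
# implied by the chosen aperture and shutter speed, compare it with
# the metered scene EV, and report under/over. The EV formula is the
# standard definition; everything else is simplified.

import math

def settings_ev(f_number, shutter_s):
    """EV implied by the aperture/shutter pair: log2(N^2 / t)."""
    return math.log2(f_number**2 / shutter_s)

def judge(scene_ev, f_number, shutter_s, tolerance=0.5):
    """Compare set exposure to metered EV, like a match-needle meter."""
    diff = scene_ev - settings_ev(f_number, shutter_s)
    if diff > tolerance:
        return "overexposed"    # settings admit more light than needed
    if diff < -tolerance:
        return "underexposed"   # settings admit too little light
    return "ok"

# Sunny-16 sanity check: EV 15 scene, f/16 at 1/125 s is about right.
print(judge(15, 16, 1/125))   # ok
print(judge(15, 2.8, 1/60))   # overexposed
print(judge(5, 16, 1/125))    # underexposed
```

In the actual camera this comparison was done on analog voltages (or, later, a few bits in a microcontroller), but the decision logic is no more complicated than this.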
 
Last edited:

Helge

Member
Joined
Jun 27, 2018
Messages
3,938
Location
Denmark
Format
Medium Format
By the way, the forum software here is great. I do almost all of my posts on the iPhone 11 Pro Max and editing goes smoothly without any problems. The system is far better than anything comparable I've ever had worked with 👍

It’s good as forums go. Makes you wonder why people put up with terrible UIs and bugs on Facebook, Instagram, Reddit, YouTube or Discord.

Still, it's terrible that we are stuck with heavily modal, basically glorified markup scripts descended from sixties IBM standards, when much better things were demonstrated by Engelbart 55 years ago, again at Xerox PARC in the seventies, and by Bill Atkinson in HyperCard in '87.
 
Last edited by a moderator: