Why do you have to activate or deactivate autoneg on a fiber port?
First you have to know what autoneg does: when Fast Ethernet was launched at the end of 1995, its copper ports were also meant to support 10 Mbit/s, which was easy and cheap to implement. That way "old" Ethernet devices (mainly hubs) could be attached to a "high-speed" backbone; fiber ports were usually reserved for the backbone itself. An FE copper port automatically adapted to its neighbor: it recognized the neighbor's speed and selected its own speed accordingly. The duplex mode, however, could not be detected this way. The IEEE therefore decided that an autoneg port which detects a non-autoneg neighbor must fall back to half duplex (HDX). At that time autoneg did not work reliably, so consultants recommended switching it off wherever possible. This has changed: the author has not seen autoneg issues for a long time.
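The fallback rule above can be sketched in a few lines of Python. This is a simplified illustration, not the actual IEEE 802.3 Clause 28 state machine; the function and parameter names are made up for this sketch:

```python
def resolve_duplex(local_autoneg: bool, peer_autoneg: bool,
                   peer_advertised_duplex: str = "full") -> str:
    """Simplified duplex resolution for a copper Fast Ethernet port.

    If both sides run autoneg, the duplex mode is taken from the
    peer's advertisement. If the peer does not answer autoneg
    ("parallel detection"), the speed can still be sensed from the
    link signalling, but the duplex mode cannot -- so the standard
    mandates falling back to half duplex.
    """
    if local_autoneg and peer_autoneg:
        return peer_advertised_duplex   # negotiated, e.g. "full"
    # Peer is silent on autoneg: speed detectable, duplex is not.
    return "half"                       # mandatory HDX fallback

# A fixed-FDX peer facing an autoneg port yields the classic
# duplex mismatch: the autoneg side drops to HDX.
print(resolve_duplex(True, False))      # -> half
```

This is exactly why "autoneg on one side, fixed FDX on the other" produces late collisions and CRC errors: the two ends end up in different duplex modes.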
Fiber ports were a different story:
At 10 Mbit/s a wavelength of 850 nm was used. For Fast Ethernet (100BASE-FX acc. to IEEE 802.3, section 2) the IEEE simply "copied" the FDDI spec so that the existing FDDI PMD components could be reused as a quick solution. But FDDI (ISO 9314), which offers only one speed, used a wavelength of 1300 nm to cover longer distances. As a result, a fiber port supporting both speeds would have needed two transmitters and two receivers combined with an optical coupler, far too expensive for the number of units that would have been sold. So no manufacturer offered such a port, and consequently no autoneg was needed: FE fiber ports are configured with fixed settings.
With Gigabit Ethernet, autoneg was needed to exchange information between the two neighbors (duplex capability, flow control and remote fault signalling).
Therefore GE fiber ports need to have autoneg enabled. A link between two ports with autoneg disabled may come up, but it won't be reliable.
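The information exchanged by 1000BASE-X autoneg fits into a 16-bit "base page" (IEEE 802.3 Clause 37). The bit positions below follow the standard; the small encoder is only an illustrative sketch, not vendor code:

```python
# 1000BASE-X autoneg base page bits (IEEE 802.3 Clause 37).
FD  = 1 << 5    # full-duplex capable
HD  = 1 << 6    # half-duplex capable
PS1 = 1 << 7    # pause (flow control) capability bits
PS2 = 1 << 8
RF1 = 1 << 12   # remote-fault signalling
RF2 = 1 << 13
ACK = 1 << 14   # acknowledge received page
NP  = 1 << 15   # next page follows

def base_page(full_duplex: bool = True, sym_pause: bool = True) -> int:
    """Build a base page advertising duplex and pause abilities
    (illustrative helper, not a real driver API)."""
    page = 0
    if full_duplex:
        page |= FD
    if sym_pause:
        page |= PS1     # symmetric PAUSE
    return page

print(hex(base_page()))   # -> 0xa0  (full duplex + symmetric pause)
```

If one side never sends such pages (autoneg disabled), the other side has nothing to negotiate against, which is why both ends of a GE fiber link must agree on the autoneg setting.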
Fast Ethernet fiber ports: autoneg off (or not offered at all), fixed duplex mode (usually FDX between switches and routers)
Gigabit Ethernet fiber ports: autoneg on