Apple

- NASDAQ:AAPL
Last Updated 2022-11-28

Patent Grants Data

Patents granted to organizations.
Ticker Symbol Entity Name Publication Date Filing Date Patent ID Invention Title Abstract Patent Number Claims Number of Claims Description Application Number Assignee Country Kind Code Kind Code Description url Classification Code Length of Grant Date Added Date Updated Company Name Sector Industry
nasdaq:aapl Apple Apr 26th, 2022 12:00AM Jun 7th, 2021 12:00AM https://www.uspto.gov?id=USD0949902-20220426 Electronic device with graphical user interface D949902 The ornamental design for an electronic device with graphical user interface, as shown and described. 1 The file of this patent contains at least one drawing/photograph executed in color. Copies of this patent with color drawing(s)/photograph(s) will be provided by the Office upon request and payment of the necessary fee. FIG. 1 is a front view of a display screen or portion thereof with graphical user interface showing the claimed design; FIG. 2 is another embodiment thereof; and, FIG. 3 is a front view of an electronic device having a display screen with the graphical user interface of FIG. 1 applied to the display screen. The graphical user interface design of FIG. 2 may be similarly applied thereto. The outer dashed broken lines in the figures show a display screen or portion thereof, or an electronic device having a display screen, and form no part of the claimed design. The other dashed broken lines in the figures show portions of the graphical user interface that form no part of the claimed design. The dot-dash broken lines in the figures and the areas within the dot-dash broken lines show portions of the graphical user interface that form no part of the claimed design. The lined through text in FIGS. 1-3 show portions of the graphical user interface that form no part of the claimed design. 29787422 apple inc. USA S1 Design Patent Open D14/486 15 Apr 27th, 2022 08:45AM Apr 27th, 2022 08:45AM Technology Technology Hardware & Equipment Information Technology
nasdaq:aapl Apple Apr 26th, 2022 12:00AM Jul 12th, 2021 12:00AM https://www.uspto.gov?id=USD0950103-20220426 Building D950103 The ornamental design for a building, as shown and described. 1 FIG. 1 is a top front perspective view of a building showing the claimed design; FIG. 2 is a front view thereof; FIG. 3 is a left side view thereof; and, FIG. 4 is a right side view thereof. The dashed broken lines in the figures show portions of the building that form no part of the claimed design. The oblique shade lines in the figures show transparency or translucency. 29798975 apple inc. USA S1 Design Patent Open D25/25 15 Apr 27th, 2022 08:45AM Apr 27th, 2022 08:45AM Technology Technology Hardware & Equipment Information Technology
nasdaq:aapl Apple Apr 26th, 2022 12:00AM Aug 9th, 2021 12:00AM https://www.uspto.gov?id=USD0949823-20220426 Headphones D949823 The ornamental design for headphones, as shown and described. 1 FIG. 1 is a top front perspective view of headphones showing the claimed design; FIG. 2 is a top rear perspective view thereof; FIG. 3 is a front view thereof; FIG. 4 is a rear view thereof; FIG. 5 is a left side view thereof; FIG. 6 is a right side view thereof; FIG. 7 is a top view thereof; and, FIG. 8 is a bottom view thereof. The broken lines in the figures show portions of the headphones that form no part of the claimed design. 29802919 apple inc. USA S1 Design Patent Open D14/205 15 Apr 27th, 2022 08:45AM Apr 27th, 2022 08:45AM Technology Technology Hardware & Equipment Information Technology
nasdaq:aapl Apple Apr 26th, 2022 12:00AM Mar 5th, 2020 12:00AM https://www.uspto.gov?id=US11317187-20220426 Foldable headphones with multiple operating modes Headphones include left and right earpieces mounted to rotatable arms and connected by a resilient U-shaped band. The left and right earpieces can be positioned by folding the earpieces inward toward the headband or extending the earpieces away from the headband. A sensor can detect the position of the left and right earpieces and transition the headphones between three operating modes, including a playback mode, a standby mode, and an off mode. 11317187 1. A pair of headphones comprising: a headband having first and second opposing ends; a first earpiece pivotally attached to the first end of the headband enabling the first earpiece to be moved between a first position in which the first earpiece is folded inward towards the headband and a second position in which the first earpiece is unfolded; a second earpiece pivotally attached to the second end of the headband enabling the second earpiece to be moved between a first position in which the first earpiece is folded inward towards the headband and a second position in which the second earpiece is unfolded; one or more sensors that generate sensor data indicating whether each of the first and second earpieces are in the first or second position; a rechargeable battery; and control circuitry coupled to the sensor data from the one or more sensors and configured to set the pair of headphones in: (i) a playback mode when both the first and second earpieces are in the second position, (ii) in an off mode when both the first and second earpieces are in the first position, and (iii) in a third operating mode when one of the first or second earpieces is in the first position and the other of the first or second earpieces is in the second position, wherein when the pair of headphones are in the third operating mode audio playback is stopped in both the first and 
second earpieces and the pair of headphones consume less power than when in the playback mode but more power than when in the off mode. 2. The pair of headphones of claim 1, further comprising a pivot joint coupling the first or second earpiece with the headband, the pivot joint comprising a moveable stop for resisting pivoting, wherein movement of the stop is resisted by a compressible device. 3. The pair of headphones of claim 1, wherein the one or more sensors are positioned adjacent to a pivot joint coupling the first or second earpiece with the headband. 4. The pair of headphones of claim 3, wherein the one or more sensors include a proximity sensor, a hall effect sensor, an optical sensor, or a mechanical sensor. 5. The pair of headphones of claim 1, the first or second earpiece comprising a contact plate pivotable to contact one or more buttons coupled to the control circuitry to control one or more predetermined functions. 6. The pair of headphones of claim 5, wherein the one or more predetermined functions include volume up, volume down, play, pause, next track, previous track, fast-forward, and rewind. 7. 
A pair of headphones comprising: a headband having first and second opposing ends; a first earpiece pivotally attached to the first end of the headband by a first arm enabling the first earpiece to be moved between a first position in which the first earpiece is folded inward towards the headband and a second position in which the first earpiece extends away from the headband; a second earpiece pivotally attached to the second end of the headband by a second arm enabling the second earpiece to be moved between a first position in which the first earpiece is folded inward towards the headband and a second position in which the second earpiece extends away from the headband; one or more sensors that generate sensor data indicating whether each of the first and second earpieces are in the first or second position; and control circuitry coupled to the sensor data from the one or more sensors and configured to set the pair of headphones in: (i) a first operating mode when both the first and second earpieces are in the first position, (ii) in a second operating mode when both the first and second earpieces are in the second position, and (iii) in a third operating mode when one of the first or second earpieces is in the first position and the other of the first or second earpieces is in the second position, wherein when the pair of headphones are in the third operating mode, audio playback is stopped in both the first and second earpieces and the pair of headphones consume less power than when in the second operating mode but more power than when in the first operating mode. 8. The pair of headphones of claim 7, wherein the first operating mode is an off mode, the second operating mode is a playback mode, and the third operating mode is a standby mode. 9. The pair of headphones of claim 7, wherein the first or second earpiece comprises one or more buttons coupled to the control circuitry to control one or more predetermined functions. 10. 
The pair of headphones of claim 9, wherein the first or second earpiece comprises a contact plate pivotable to contact the one or more buttons. 11. The pair of headphones of claim 7, wherein the headband is pivotally coupled to the first arm or the second arm with a pivot joint comprising a stop for resisting pivoting of the respective arm. 12. The pair of headphones of claim 11, wherein the one or more sensors are positioned adjacent to the pivot joint and include a proximity sensor, a hall effect sensor, an optical sensor, or a mechanical sensor. 13. A pair of headphones comprising: a headband having first and second opposing ends; a first earpiece pivotally attached to the first end of the headband by a first arm enabling the first earpiece to be moved between a first position in which the first earpiece is folded inward towards the headband and a second position in which the first earpiece extends away from the headband; a second earpiece pivotally attached to the second end of the headband by a second arm enabling the second earpiece to be moved between a first position in which the first earpiece is folded inward towards the headband and a second position in which the second earpiece extends away from the headband; one or more sensors that generate sensor data indicating whether each of the first and second earpieces are in the first or second position; and control circuitry coupled to the one or more sensors and configured to receive sensor data and set the pair of headphones in a first operating mode, a second operating mode, and a third operating mode, the control circuitry setting the pair of headphones in: (i) the first mode when both the first and second earpieces are in the second position, (ii) the second mode when both the first and second earpieces are in the first position, and (iii) the third operating mode when one of the first or second earpieces is in the first position and the other of the first or second earpieces is in the second position, 
wherein when the headphones are in the third operating mode, audio playback is stopped in both the first and second earpieces and the pair of headphones consume less power than in the second operating mode but more power than when in the first operating mode. 14. The pair of headphones of claim 13, wherein the first operating mode is an off mode, the second operating mode is a playback mode, and the third operating mode is a low power standby mode. 15. The pair of headphones of claim 13, wherein the headband is pivotally coupled to the first arm or the second arm with a pivot joint comprising a stop for resisting pivoting of the respective arm. 16. The pair of headphones of claim 13, the first earpiece or the second earpiece comprising one or more buttons coupled with the control circuitry, the control circuitry performing a predetermined function based on input from the one or more buttons. 17. The pair of headphones of claim 16, wherein the one or more buttons are covered by a contact plate pivotable to contact the one or more buttons coupled to the control circuitry to control one or more predetermined functions. 17 CROSS-REFERENCE TO RELATED APPLICATION This application claims the benefit of and priority to U.S. Provisional Application No. 62/900,283, filed on Sep. 13, 2019, and titled “HEADPHONES,” the content of which is herein incorporated by reference in its entirety for all purposes. FIELD This disclosure generally relates to headphones and, more specifically, to headphones with multiple operating modes. BACKGROUND OF THE INVENTION Headphones are commonly used to listen to content on electronic devices such as smart phones, tablet computers, laptop computers, televisions and the like. Some headphones are wireless headphones that are powered by one or more internal batteries. The batteries store a limited amount of charge that is depleted through operation of the headphones, requiring recharging before operation of the headphones can continue. 
Some of these wireless headphones have features like auto on/off and/or can enter a sleep mode to conserve battery power. The auto on/off and sleep modes can extend the battery life of the headphones but can inhibit a user's listening experience if not implemented in an intuitive and easily understood manner. BRIEF SUMMARY OF THE INVENTION This disclosure describes various embodiments of headphones that include multiple power modes that enable the headphones to conserve battery power when the headphones are not being actively used. In some embodiments, the headphones can fold to automatically transition the headphones between three different power modes including a playback or ON mode, a sleep mode and an OFF mode. For example, both the left and right earpieces can be unfolded in a listening configuration triggering an operating mode for playback; one of the left or right earpieces can be folded with the other earpiece unfolded to trigger a standby mode where battery consumption is reduced but playback can quickly be resumed; and both the left and right earpieces can be folded inward into a storage configuration triggering the headphones to turn off. Folding the left and/or right earpieces to switch between power modes provides a simple and intuitive manner in which a user can change the operation mode of the headphones. For example, if a user is transporting the headphones and wants to conserve the battery power, the user can fold both earpieces into the storage configuration and the headphones will automatically switch to the off mode. Similarly, if a user wants to conserve battery for a period of time but wants to be able to quickly resume playback, one earpiece can be folded, triggering the standby mode. The user can then unfold the folded earpiece to trigger the playback mode. Headphones according to some embodiments can include a headband and left and right arms connected to the headband. 
The left and right arms can connect the left and right earpieces, respectively, to the headband and can each include sensors for detecting whether the arms (and thus the earpieces) are folded inward towards the headband or are in an unfolded state. The left and right arms can fold independently to transition between the different power modes. A pair of headphones is disclosed and includes the following: a headband having first and second opposing ends; a first earpiece pivotally attached to the first end of the headband enabling the first earpiece to be moved between a first position in which the first earpiece is folded inward towards the headband and a second unfolded position; a second earpiece pivotally attached to the second end of the headband enabling the second earpiece to be moved between a first position in which the first earpiece is folded inward towards the headband and a second unfolded position; one or more sensors that generate sensor data indicating whether each of the first and second earpieces are in the first or second position; a rechargeable battery; and control circuitry coupled to the sensor data from the one or more sensors and configured to set the pair of headphones in: (i) a playback mode when both the first and second earpieces are in the second position, (ii) in an off mode when both the first and second earpieces are in the first position, and (iii) in a third operating mode when one of the first earpiece 110 or second earpiece 120 is in the first position and the other of the first earpiece 110 or second earpiece 120 is in the second position, wherein when the headphones are in the third operating mode the headphones consume less power than in the playback mode but more power than when in the off mode. To better understand the nature and advantages of the present invention, reference should be made to the following description and the accompanying figures. 
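The fold-state-to-mode logic summarized above can be expressed compactly. The following is a hypothetical Python sketch, not part of the patent disclosure; the names `Mode` and `mode_for_fold_state` are illustrative assumptions:

```python
from enum import Enum

class Mode(Enum):
    PLAYBACK = "playback"  # both earpieces unfolded (listening configuration)
    STANDBY = "standby"    # exactly one earpiece folded (low power, quick resume)
    OFF = "off"            # both earpieces folded (storage configuration)

def mode_for_fold_state(left_folded: bool, right_folded: bool) -> Mode:
    """Map the fold state of the two earpieces to an operating mode,
    following the three-mode scheme described in the summary."""
    if left_folded and right_folded:
        return Mode.OFF
    if not left_folded and not right_folded:
        return Mode.PLAYBACK
    return Mode.STANDBY
```

For example, `mode_for_fold_state(False, True)` (one earpiece folded) yields `Mode.STANDBY`, where the summary indicates playback stops but power draw stays between the off and playback modes.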
It is to be understood, however, that each of the figures is provided for the purpose of illustration only and is not intended as a definition of the limits of the scope of the present invention. Also, as a general rule, and unless it is evident to the contrary from the description, where elements in different figures use identical reference numbers, the elements are generally either identical or at least similar in function or purpose. BRIEF DESCRIPTION OF THE DRAWINGS FIG. 1 shows a perspective view of headphones in accordance with some embodiments of the disclosure; FIGS. 2A through 2C are illustrations of different configurations of the headphones of FIG. 1 according to some embodiments of the present invention. FIGS. 3A and 3B are simplified illustrations of a sensor that can be used with the headphones of FIG. 1 according to some embodiments of the present invention; FIG. 4 is a cross section of a hinge that can be used with the headphones of FIG. 1 according to some embodiments of the present invention; FIG. 5 is a side view of an earpiece that can be used with the headphones of FIG. 1 according to some embodiments of the present invention; and FIG. 6 is a cross section of the earpiece of FIG. 5 according to some embodiments of the present invention. DETAILED DESCRIPTION OF THE INVENTION FIG. 1 shows a perspective view of headphones 100 in accordance with some embodiments of the disclosure. The headphones 100 can be, for example, over-ear or on-ear headphones. The headphones 100 can include a first earpiece 110 coupled with a first arm assembly 130 and a second earpiece 120 coupled with a second arm assembly 140. The first arm assembly 130 and the second arm assembly 140 can be interconnected by a headband 150 that in the depicted embodiment is substantially U-shaped or C-shaped to enable the headband to better conform to the contour of a user's head. The headphones 100 can include electronic circuitry and/or components (not shown in FIG. 
1) for controlling various functions of the headphones 100. In various embodiments, the electronic circuitry and/or components can include a controller (e.g., a microcontroller, an ASIC, and/or various other logic circuitry and/or discrete components) for controlling audio playback, power modes, and/or communication. In some embodiments the controller can be positioned in the first earpiece 110 and/or the second earpiece 120. The first earpiece 110 can include cushioning 112 for sitting against a user's head and a casing 114 surrounding the internal electronic components of the earpiece. The cushioning 112 can generally conform to the shape of the user's head and/or ear to minimize the travel of sound outside the cushioning. In some embodiments, the cushioning can include a protective layer of material, for example, leather, plastic, silicon, or any suitable material. As discussed further in reference to FIGS. 5 and 6, the casing 114 can include multiple portions, with one or more portions moveable to accommodate movement of one or more internal components. In some embodiments, the first earpiece 110 can pivot relative to the first arm assembly 130. The first earpiece 110 can include electronic components for wireless communication (e.g., Bluetooth or Wi-Fi), one or more battery modules for powering the headphones 100, and/or one or more audio output components (e.g., one or more speakers) for output of audio data. The second earpiece 120 can include some or all of the same or similar components of the first earpiece 110 (e.g., cushioning, a casing, wireless communication components, a battery module, etc.). In various embodiments, the first earpiece 110 and/or the second earpiece 120 can be mounted to position a central axis of the earpiece at an angle relative to a lateral centerline of the headband 150. 
The central axis of the first earpiece 110 and/or the second earpiece 120 can be, for example, angled between 80 degrees and 100 degrees relative to the lateral centerline of the headband 150. In some embodiments, the first earpiece 110 and/or the second earpiece 120 can include a port 160. The port 160 can be mated with a corresponding cable for transmission of power and/or data to and/or from the headphones 100. For example, port 160 can be mated with a corresponding cable for electrically coupling the headphones 100 with an electronic device, such as a smart phone, tablet computer, laptop computer, power supply or other appropriate electronic device. In some instances the electronic device can transmit both power and audio data to the headphones 100 through port 160 via a cable. In other instances, however, the electronic device can transmit power to headphones 100 through the port 160 via the cable while audio data can be transmitted to the headphones 100 via wireless circuitry (e.g., Bluetooth or Wi-Fi circuitry). In some embodiments, the port 160 can include a receptacle connector, such as a TRS audio jack, a micro-USB connector, USB C connector, a Firewire connector, a lightning connector developed by Apple, or any other suitable type of connector. The first arm assembly 130 and/or the second arm assembly 140 can be pivotally coupled with the headband 150 for pivoting between a first position (e.g., a folded position) and a second position (e.g., an extended position). As discussed further in reference to FIG. 4, in some embodiments, a pivot joint 132 can couple the first arm assembly 130 and/or the second arm assembly 140 with the headband 150. In various embodiments, the first arm assembly 130 and/or the second arm assembly 140 can include a sliding member 170 for coupling the first earpiece 110 with the first arm assembly 130 and/or the second earpiece 120 with the second arm assembly 140. 
The sliding member 170 can slide internally relative to one end of the first arm assembly 130 and/or the second arm assembly 140 to shorten or lengthen the headband as described below. The first arm assembly 130 and/or the second arm assembly 140 and the sliding member 170 can be coupled via a friction-based adjustment mechanism. The friction-based adjustment mechanism can include a channel formed internally in the first arm assembly 130 and/or the second arm assembly 140 for receiving the sliding member 170. The friction-based adjustment mechanism can cause a biasing frictional force between the external surfaces of the sliding member 170 and the internal surfaces of the channel in the first arm assembly 130 and/or the second arm assembly 140. The biasing frictional force can prevent the sliding member 170 from moving without an applied external force. For example, the biasing frictional force can prevent the weight of the first earpiece 110 or the second earpiece 120 from moving the sliding member 170. The channel size and sliding member 170 thickness can be designed so that the biasing frictional force has a predetermined force range that is overcome before the sliding member 170 can be moved. In some embodiments, the sliding member 170 can include a surface and/or surface treatment that can be applied to adjust the biasing frictional force needed to move the sliding member 170. The sliding member 170 can adjust the size of the headphones 100 to adapt the headphones to provide a more comfortable fit for users. 
For example, sliding member 170 can be positioned: to minimize the distance between the first earpiece 110 and the distal end of the first arm assembly 130 and the second earpiece 120 and the distal end of the second arm assembly 140; to maximize the distance between the first earpiece 110 and the distal end of the first arm assembly 130 and the second earpiece 120 and the distal end of the second arm assembly 140; or at a position between the maximized distance and the minimized distance. In various embodiments, the sliding member 170 can be partially or fully disposed within the first arm assembly 130 and/or the second arm assembly 140 when the first arm assembly 130 and/or the second arm assembly 140 is in the folded position. A user can adjust the size of the headphones by applying a force to the sliding member 170 to overcome the biasing frictional force. In some embodiments, the force can be applied by pushing or pulling on the first earpiece 110 and/or the second earpiece 120. The headband 150 can include a support structure and one or more layers of padding. The support structure can bias the first earpiece 110 and the second earpiece 120 a distance apart. The biased distance can be smaller than a user's head width, requiring users to apply a force to separate the first earpiece 110 and the second earpiece 120 before wearing the headphones 100. The biasing force can also hold the headphones 100 on a user's head while the user is wearing the headphones 100. The padding can be positioned between the user's head and the support structure to improve the comfort of the headphones 100 for the user. In some embodiments, the padding can be covered with a layer of protective material, for example, leather. FIGS. 2A through 2C are illustrations of different configurations of the headphones 100 of FIG. 1 according to some embodiments of the present invention. In FIG. 2A, the headphones 100 are in a listening configuration; in FIG. 
2B, the headphones 100 are in a standby configuration; and in FIG. 2C the headphones 100 are in a storage configuration. In FIG. 2A, the headphones 100 are in a listening configuration with both the first earpiece 110 and the second earpiece 120 extending away from the headband enabling the headphones to be worn by a user with the earpieces positioned over the user's ears. In the listening configuration, headphones 100 can be fully powered ON to provide audio playback to the user by outputting audio data received from an electronic device connected to the headphones (e.g., via either a wired or wireless connection) through the speakers in each earpiece. A user can activate a sleep mode of headphones 100 by folding one of the first earpiece 110 or the second earpiece 120 inward towards the headband 150 as shown in FIG. 2B. For example, as shown in FIG. 2B, second arm assembly 140 (and thus second earpiece 120) is folded inward towards headband 150 placing the headphones 100 in a low-power standby mode to reduce the battery consumption of the headphones 100. The standby mode can reduce power consumption by reducing power supplied to one or more components. For example, audio playback can be stopped and the power supplied to wireless communication circuitry can be reduced and/or stopped. The standby mode reduces power consumption of the headphones 100, increasing the length of operating time before the headphones 100 need to be recharged. For example, in some embodiments the headphones can operate in the standby mode approximately five times longer than they could operate in the playback mode. A user can turn the headphones OFF (or place them in a low power mode that consumes even less battery power than sleep mode) by folding both the first and second earpieces 110 and 120 inward towards the headband 150 as shown in FIG. 2C, which places the headphones 100 in a configuration that is convenient for storing the headphones in an appropriate case or similar storage component. 
In some embodiments, in the OFF power mode, power is withheld from substantially all of the electronic components in the headphones except those that allow the headphones to be turned back on. In other embodiments, the OFF power mode can continue to supply power to certain components that provide desired functionality but switch power off to a sufficient number of components such that the headphones require less power than in sleep mode or in the playback mode. The headphones 100 can include control circuitry and/or associated components (e.g., one or more sensors as discussed below) to detect when one or both of the first earpiece 110 or the second earpiece 120 are in the folded or unfolded positions and activate the appropriate operational mode based on the earpiece positions. The control circuitry and/or associated components can automatically put the headphones 100 in the standby mode when it detects that one of the first earpiece 110 or second earpiece 120 is in the folded position and the other is in the unfolded position. In the standby mode, the audio playback can be stopped to one or both of the first earpiece 110 or second earpiece 120. The control circuitry and/or associated components can further automatically put the headphones 100 in the OFF mode when it detects that both the first earpiece 110 and the second earpiece 120 have been folded inward towards the headband 150. Conversely, the control circuitry and associated components can deactivate the sleep mode or the OFF mode and place the headphones in playback mode when it detects that both of the earpieces are in the unfolded (i.e., extended) position. For example, the headphones 100 can be reconfigured from the storage configuration to the standby configuration by unfolding one of the first earpiece 110 or second earpiece 120 or to the listening configuration by unfolding both of the first earpiece 110 and second earpiece 120. 
When the headphones 100 are reconfigured, the control circuitry and associated components detect the change in configuration of the headphones and can automatically transition the headphones from the OFF mode to the standby mode or to the playback mode as appropriate. While embodiments of the disclosure enable headphones 100 to be placed in different operational modes by folding or unfolding the first and second earpieces as discussed above, some embodiments can also include additional user input mechanisms that enable a user to select or change an operational mode of the headphones. As one example, some embodiments can include one or more buttons (e.g., on the outer surface of one of the earpieces) that can be selected or activated by a user to switch the headphones between playback, sleep and OFF modes. As another example, some embodiments can include voice activated controls that enable a user to switch the headphones between the different operational modes. FIGS. 3A and 3B are simplified illustrations of a sensor 310 that can be used with the headphones 100 of FIG. 1 according to some embodiments of the present invention. The sensor 310 can provide data to the controller for controlling functions of the headphones. The sensor 310 is shown as positioned in headband 150; however, sensor 310 can be positioned in the first earpiece 110, the first arm assembly 130, the second earpiece 120, and/or the second arm assembly 140. FIGS. 3A and 3B are discussed in relation to the first arm assembly 130, but it is to be understood that the second arm assembly 140 can also include a sensor similar or identical to sensor 310. In various embodiments, the sensor 310 can detect a position of the first earpiece 110 and/or the first arm assembly 130. For example, the sensor 310 can detect whether the earpiece is folded inward towards the headband, extended away from the headband, and/or at a position between the two. 
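Position sensing of this kind is often implemented as a threshold test on the sensor reading. The sketch below is a hypothetical illustration, not the patent's implementation; the voltage values, the hysteresis band, and the function name `classify_arm` are all assumptions. Because a hall effect sensor's output varies continuously as the magnet moves, a small dead band between the folded and extended thresholds keeps the reading from flapping when the arm rests near the boundary:

```python
# Assumed illustrative voltages: the magnet is closer to the sensor when the
# arm is extended, which we take here to produce a higher sensor voltage.
FOLD_THRESHOLD_V = 1.4    # at or below this reading, treat the arm as folded
EXTEND_THRESHOLD_V = 1.6  # at or above this reading, treat the arm as extended

def classify_arm(voltage: float, previously_folded: bool) -> bool:
    """Return True if the arm reads as folded, with hysteresis between reads."""
    if voltage <= FOLD_THRESHOLD_V:
        return True
    if voltage >= EXTEND_THRESHOLD_V:
        return False
    # Within the dead band: keep the last known state to avoid chattering.
    return previously_folded
```

The controller could then feed the folded/extended state of each arm into the mode-selection logic described above.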
The position of the first arm assembly 130 can be used by the controller to determine whether to transition the headphones 100 to the playback mode, the standby mode, or the off mode. The sensor 310 can include a mechanical sensor, an optical sensor, an electronic sensor, and/or any suitable sensor for sensing the position of the first earpiece 110 and/or the first arm assembly 130. In some embodiments, the sensor 310 can include a hall effect sensor that can sense a position of a magnet 320 relative to the sensor 310. As the magnet 320 is moved relative to the sensor 310, a voltage change can occur in the sensor 310. The sensor 310 can be positioned in the headband 150 and the magnet 320 can be positioned in the first arm assembly 130. The magnet 320 can be positioned in the first arm assembly 130 at pivot joint 132, such that, when the first arm assembly 130 is in an extended position, the magnet 320 is closer to the sensor 310 than when the first arm assembly 130 is in the folded position. For example, as shown in FIG. 3A, the first arm assembly 130 is in the extended position and the sensor 310 is positioned near the magnet 320. In FIG. 3B, the first arm assembly 130 has been reconfigured from the extended position to the folded position and the magnet 320 has moved away from sensor 310. The movement of the magnet 320 relative to the sensor 310 can cause a voltage change in the sensor 310. The controller can use the voltage change in the sensor 310 to determine a position of the first arm assembly 130. FIG. 4 is a cross-section of a pivot joint 132 that can be incorporated into the headphones 100 of FIG. 1 according to some embodiments of the present invention. As shown, the pivot joint 132 can be used to couple a first end of the headband 150 with the first arm assembly 130. However, the pivot joint 132 can additionally or alternatively couple a second end of the headband 150 with the second arm assembly 140. 
The pivot joint 132 can include a compressible component 410 that can apply a retention force to a stop 420, increasing the force required to pivot the first arm assembly 130. The compressible component 410 can apply a force to the stop 420 to hold the stop 420 against a pivot surface 432. The compressible component 410 can be contained in a channel 412 to prevent the compressible component 410 from moving or flexing out of position. In various embodiments, the compressible component 410 can compress in response to the stop 420 moving. The stop 420 can move in response to the first arm assembly 130 rotating about rotation point 430. In some embodiments, the pivot surface 432 can include a cliff 434 that can increase the force required to move the stop 420 and compress the compressible component 410. For example, as the first arm assembly 130 rotates about rotation point 430, the cliff 434 can push the stop 420 against the compressible component 410. The compressible component 410 can resist the stop 420, preventing the first arm assembly 130 from pivoting until a predetermined force has been applied to the first arm assembly 130. The compressible component 410 can be a spring, rubber, foam, or any suitable compressible material. The stop 420 can be a ball, a cylinder, an oval, or any suitable component.

Some embodiments of the disclosure pertain to headphones that include an improved user interface on the casing of at least one of the first earpiece 110 or second earpiece 120 that enables a user to select various operational functions of the headphones. FIG. 5 is a side view of an earpiece assembly 500 that can be incorporated into the headphones 100 of FIG. 1 to provide three separate user-activated buttons behind a single, one-piece contact plate 520 according to some embodiments of the present invention. The earpiece assembly 500 can be coupled with the first arm assembly 130 and/or the second arm assembly 140 via sliding member 170.
The earpiece assembly 500 can include earpiece housing 510 and contact plate 520. The earpiece housing 510 can provide support for cushioning 112 and/or protection for electronic components (e.g., the audio output component). The earpiece housing 510 can be or include molded plastic, metal, acrylic, and/or carbon fiber. Contact plate 520 can provide protection for one or more buttons 522 that can receive user input. The user input can control various functions of the headphones 100 (e.g., volume controls and/or audio playback controls). The contact plate 520 can be a plate that pivots about a central pivot point. The buttons 522 can be positioned beneath the contact plate 520 to allow a user to push down on a portion of the contact plate 520 to toggle one or more of the buttons. The buttons 522 can be positioned to toggle when a corresponding portion of the contact plate 520 has been pressed. For example, a first button 522A can be positioned and toggled by a user pushing on a top portion of the contact plate 520, a second button 522B can be positioned and toggled by a user pushing on a middle portion of the contact plate 520, and a third button 522C can be positioned and toggled by a user pushing on a bottom portion of the contact plate 520. In various embodiments, the contact plate 520 can include various indentations and/or surface treatments that can aid in toggling the buttons 522.

FIG. 6 is a cross section of a portion of the earpiece assembly 500 of FIG. 5 according to some embodiments of the present invention. The earpiece assembly 500 can include contact plate 520 covering one or more buttons 522. The buttons 522 can include a press transfer 524 that can transfer a user's press input to one or more contact pads 526. The contact pads 526 can send a signal to the controller to provide input for controlling various operations of the headphones 100.
The press transfer 524 can be coupled with the contact plate 520 to maintain contact between the press transfer 524 and the contact plate 520. The press transfer 524 and the contact plate 520 can be coupled using adhesives, fasteners, and/or any suitable connection means. In some embodiments, the contact plate 520 includes receiving points for one or more press transfers. For example, the contact plate 520 can include channels for receiving a portion of the press transfer 524. Press transfer 524 can move in response to a user pressing on the contact plate 520. For example, when a user presses on a portion of the contact plate 520, the press transfer 524 can move in response. The press transfer 524 can move until it contacts the contact pads 526. The contact pads 526 can send a signal to the controller in response to the contact from the press transfer 524. In various embodiments, the press transfers 524 and the contact pads 526 can be separated by varying distances. The varying distances can allow a user to press on the contact plate 520 without causing multiple press transfers 524 to contact their corresponding contact pads 526. For example, a press transfer 524B positioned in the middle of the contact plate 520 can be positioned closer to the contact pad 526B than press transfers 524A and/or 524C. For example, when a user presses on the middle portion of the contact plate 520, the contact plate 520 can depress, moving press transfers 524A, 524B, and 524C. However, the middle press transfer 524B can be positioned closer to its corresponding contact pad 526B to receive the press input before either of press transfers 524A or 524C is able to contact its corresponding contact pad. In various embodiments, the press transfer 524 can act as a pivot point for the contact plate 520. One or more of the press transfers 524 can be positioned to allow the contact plate to pivot in response to a user's press.
For example, a press transfer 524B can be positioned in the middle portion of the contact plate 520. The press transfer 524B can allow a user to depress the press transfer 524B by pushing on the middle portion of the contact plate 520. A user can press on the upper and/or the lower portion of the contact plate 520 to depress press transfer 524A or 524C, respectively. The contact plate 520 can pivot about press transfer 524B to depress press transfer 524A or 524C without depressing press transfer 524B. In various embodiments, the earpiece assembly 500 can include gasket 530 to prevent or reduce moisture and/or dust and debris from reaching the electronic components inside earpiece assembly 500. The gasket 530 can create a sealed barrier between the electronic components and the contact plate 520. The gasket 530 can be coupled with the one or more press transfers 524 to protect the corresponding contact pads 526. The gasket 530 can be or include rubber, silicone, or any suitable material. 16810080 apple inc. USA B2 Utility Patent Grant (with pre-grant publication) issued on or after January 2, 2001. Open Apr 27th, 2022 08:45AM Apr 27th, 2022 08:45AM Technology Technology Hardware & Equipment Information Technology
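The varying-gap scheme behind the one-piece contact plate can be illustrated with a short sketch: each press transfer is modeled as a gap that must be closed before its contact pad fires, so a center press closes only the middle gap, while an edge press pivots the plate about the middle transfer. The gap values and region names are hypothetical assumptions, not figures from the patent.

```python
# Illustrative model (not from the patent text) of three buttons behind a
# single contact plate: a pad closes once the local plate travel exceeds
# that press transfer's gap; the middle transfer sits closest to its pad.

# Gap (in mm, hypothetical values) between each press transfer and its pad.
GAPS_MM = {"top": 0.6, "middle": 0.3, "bottom": 0.6}

def pressed_buttons(travel_mm: dict) -> list:
    """Return the buttons whose contact pads are closed.

    travel_mm maps each region of the contact plate to how far that
    region has been depressed, in mm.
    """
    return [region for region, gap in GAPS_MM.items()
            if travel_mm.get(region, 0.0) >= gap]

# A press on the middle of the plate moves all three transfers equally,
# but only the middle transfer's smaller gap is covered.
center_press = {"top": 0.4, "middle": 0.4, "bottom": 0.4}

# Pressing the top edge pivots the plate about the middle transfer: the
# top travels far, the middle barely moves, and the bottom does not move.
top_press = {"top": 0.8, "middle": 0.1, "bottom": 0.0}
```

The point of the model is the design choice described above: the middle pad's smaller gap makes it win on a flat press, while the pivot keeps it from firing on edge presses.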
nasdaq:aapl Apple Apr 26th, 2022 12:00AM Jun 6th, 2019 12:00AM https://www.uspto.gov?id=US11317405-20220426 Neighbor awareness networking ranging One or more wireless stations may operate to configure Neighbor Awareness Networking (NAN)—direct communication with neighboring wireless stations, e.g., without utilizing an intermediate access point. Scheduling of NAN ranging procedures may include a first wireless station sending first information, including first scheduling preferences and a first ranging role, to a second wireless station. The first wireless station receives second information, including second scheduling preferences and a second ranging role, from the second wireless station. The first wireless station may initiate the ranging procedure based on the scheduling preferences and ranging parameters. Alternatively, the second wireless station may initiate the ranging procedure based on the scheduling preferences and ranging parameters. 11317405 1. A wireless device, comprising: at least one antenna; at least one radio communicatively coupled to the antenna and configured to perform Wi-Fi communication with a Wi-Fi access point; at least one processor communicatively coupled to the at least one radio, wherein the wireless device is configured to perform voice and/or data communications; wherein the at least one processor is configured to cause the wireless device to: transmit an indication of first ranging capabilities to a peer wireless device via one of a Neighbor Awareness Networking (NAN) discovery beacon, NAN synchronization beacon, or a service discovery frame (SDF), wherein the first ranging capabilities are indicated via a bit that indicates the wireless device is NAN ranging capable; receive, from the peer wireless device, second ranging capabilities indicating NAN ranging role compatibility between the wireless device and the peer wireless device; and initiate, based on the first and second ranging capabilities, a NAN ranging operation,
wherein the NAN ranging operation is scheduled as one of a full slot schedule (FSS) NAN ranging operation, including a plurality of pre-specified time-slots and associated parameters, or a partial slot schedule (PSS) NAN ranging operation, including at least one pre-specified time-slot and associated one or more parameters. 2. The wireless device of claim 1, wherein the first ranging capabilities are further indicated via at least one of: a sub-set of ranging capabilities as defined in the Fine Timing Measurement (FTM) protocol; or a set of ranging capabilities as defined in the FTM protocol. 3. The wireless device of claim 1, wherein a schedule of the NAN ranging operation is based on one or more of an updated further availability window, current further availability window, or preferred further availability window of the wireless device. 4. The wireless device of claim 3, wherein Fine Timing Measurement (FTM) protocol parameters are derived from the schedule of the NAN ranging operation. 5. The wireless device of claim 4, wherein the FTM protocol parameters include one or more of a number of bursts exponent, a burst duration, a minimum delta FTM, a partial timer synchronization function (TSF) timer, an as soon as possible (ASAP) parameter, a number of FTMs per burst, FTM format, FTM bandwidth, or burst period. 6. The wireless device of claim 1, wherein when the NAN ranging operation is scheduled as FSS, the plurality of pre-specified time-slots and associated parameters are specified for the entire schedule of the NAN ranging operation by the peer wireless device. 7. The wireless device of claim 1, wherein when the NAN ranging operation is scheduled as PSS, the at least one pre-specified time-slot and associated one or more parameters are specified by the peer wireless device, and wherein the associated parameters for ranging are based on an updated further availability window map. 8.
The wireless device of claim 7, wherein parameters for subsequent time-slots are negotiated separately on a slot-by-slot basis by the wireless device and the peer wireless device. 9. An apparatus, comprising: a memory; and at least one processor in communication with the memory, wherein the at least one processor is configured to: generate instructions to cause transmission of an indication of first ranging capabilities to a peer wireless device via one of a Neighbor Awareness Networking (NAN) discovery beacon, NAN synchronization beacon, or a service discovery frame (SDF), wherein the first ranging capabilities are indicated via a bit that indicates the apparatus is NAN ranging capable; receive, from the peer wireless device, second ranging capabilities indicating NAN ranging role compatibility between the apparatus and the peer wireless device; and generate instructions to cause, based on the first and second ranging capabilities, an initiation of a NAN ranging operation, wherein the NAN ranging operation is scheduled as one of a full slot schedule (FSS) NAN ranging operation, including a plurality of pre-specified time-slots and associated parameters, or a partial slot schedule (PSS) NAN ranging operation, including at least one pre-specified time-slot and associated one or more parameters. 10. The apparatus of claim 9, wherein the first ranging capabilities are further indicated via at least one of: a sub-set of ranging capabilities as defined in the Fine Timing Measurement (FTM) protocol; or a set of ranging capabilities as defined in the FTM protocol. 11. The apparatus of claim 9, wherein a schedule of the NAN ranging operation is based on one or more of an updated further availability window, current further availability window, or preferred further availability window. 12. The apparatus of claim 9, wherein Fine Timing Measurement (FTM) protocol parameters are derived from the schedule of the NAN ranging operation. 13. 
The apparatus of claim 12, wherein the FTM protocol parameters include one or more of a number of bursts exponent, a burst duration, a minimum delta FTM, a partial timer synchronization function (TSF) timer, an as soon as possible (ASAP) parameter, a number of FTMs per burst, FTM format, FTM bandwidth, or burst period. 14. The apparatus of claim 9, wherein when the NAN ranging operation is scheduled as FSS, the plurality of pre-specified time-slots and associated parameters are specified for the entire schedule of the NAN ranging operation by the peer wireless device. 15. The apparatus of claim 14, wherein when the NAN ranging operation is scheduled as PSS, the at least one pre-specified time-slot and associated one or more parameters are specified by the peer wireless device, wherein the associated parameters for ranging are based on an updated further availability window map, and wherein parameters for subsequent time-slots are negotiated separately on a slot-by-slot basis with the peer wireless device. 16.
A non-transitory computer readable memory medium storing program instructions executable by a processor of a wireless device to: generate instructions to transmit an indication of first ranging capabilities to a peer wireless device via one of a Neighbor Awareness Networking (NAN) discovery beacon, NAN synchronization beacon, or a service discovery frame (SDF), wherein the first ranging capabilities are indicated via a bit that indicates the wireless device is NAN ranging capable; receive, from the peer wireless device, second ranging capabilities indicating NAN ranging role compatibility between the wireless device and the peer wireless device; and generate instructions to initiate, based on the first and second ranging capabilities, a NAN ranging operation, wherein the NAN ranging operation is scheduled as one of a full slot schedule (FSS) NAN ranging operation, including a plurality of pre-specified time-slots and associated parameters, or a partial slot schedule (PSS) NAN ranging operation, including at least one pre-specified time-slot and associated one or more parameters. 17. The non-transitory computer readable memory medium of claim 16, wherein the first ranging capabilities are further indicated via at least one of: a sub-set of ranging capabilities as defined in the Fine Timing Measurement (FTM) protocol; or a set of ranging capabilities as defined in the FTM protocol. 18. 
The non-transitory computer readable memory medium of claim 16, wherein a schedule of the NAN ranging operation is based on one or more of an updated further availability window, current further availability window, or preferred further availability window of the wireless device, wherein Fine Timing Measurement (FTM) protocol parameters are derived from the schedule of the NAN ranging operation, and wherein the FTM protocol parameters include one or more of a number of bursts exponent, a burst duration, a minimum delta FTM, a partial timer synchronization function (TSF) timer, an as soon as possible (ASAP) parameter, a number of FTMs per burst, FTM format, FTM bandwidth, or burst period. 19. The non-transitory computer readable memory medium of claim 16, wherein when the NAN ranging operation is scheduled as FSS, the plurality of pre-specified time-slots and associated parameters are specified for the entire schedule of the NAN ranging operation by the peer wireless device. 20. The non-transitory computer readable memory medium of claim 16, wherein when the NAN ranging operation is scheduled as PSS, the at least one pre-specified time-slot and associated one or more parameters are specified by the peer wireless device, wherein the associated parameters for ranging are based on an updated further availability window map, and wherein parameters for subsequent time-slots are negotiated separately on a slot-by-slot basis with the peer wireless device. 20

PRIORITY DATA

This application is a continuation of U.S. patent application Ser. No. 15/131,911, titled “Neighbor Awareness Networking Ranging”, filed Apr. 18, 2016 by Su Khiong Yong, Christiaan A Hartman, Yong Liu, Lawrie Kurian, Peter N Heerboth, Guoqing Li, Daniel R Borges, Chiu Ngok E Wong, Saravanan Balasubramaniyan, Tashbeeb Haque, and Anand Rajagopalan, which claims benefit of priority to U.S. Provisional Application Ser. No. 62/149,801, titled “Neighbor Awareness Networking Ranging”, filed Apr.
20, 2015 by Su Khiong Yong, Christiaan A Hartman, Yong Liu, Lawrie Kurian, Peter N Heerboth, Guoqing Li, Daniel R Borges, Chiu Ngok E Wong, Saravanan Balasubramaniyan, Tashbeeb Haque, and Anand Rajagopalan, which is hereby incorporated by reference in its entirety as though fully and completely set forth herein. The claims in the instant application are different from those of the parent application and/or other related applications. The Applicant therefore rescinds any disclaimer of claim scope made in the parent application and/or any predecessor application in relation to the instant application. Any such previous disclaimer, and the cited references that it was made to avoid, may need to be revisited. Further, any disclaimer made in the instant application should not be read into or against the parent application and/or other related applications.

FIELD

The present application relates to wireless communications, including techniques for wireless communication among wireless stations in a wireless networking system.

DESCRIPTION OF THE RELATED ART

Wireless communication systems are rapidly growing in usage. Further, wireless communication technology has evolved from voice-only communications to also include the transmission of data, such as Internet and multimedia content. A popular short/intermediate range wireless communication standard is wireless local area network (WLAN). Most modern WLANs are based on the IEEE 802.11 standard (or 802.11, for short) and are marketed under the Wi-Fi brand name. WLAN networks link one or more devices to a wireless access point, which in turn provides connectivity to the wider area Internet. In 802.11 systems, devices that wirelessly connect to each other are referred to as “stations,” “mobile stations,” or “user devices” (STA or UE for short). Wireless stations can be either wireless access points or wireless clients (or mobile stations).
Access points (APs), which are also referred to as wireless routers, act as base stations for the wireless network. APs transmit and receive radio frequency signals for communication with wireless client devices. APs can also typically couple to the Internet in a wired fashion. Wireless clients operating on an 802.11 network can be any of various devices such as laptops, tablet devices, smart phones, or fixed devices such as desktop computers. Wireless client devices are referred to herein as user equipment (or UE for short). Some wireless client devices are also collectively referred to herein as mobile devices or mobile stations (although, as noted above, wireless client devices overall may be stationary devices as well).

SUMMARY

Embodiments described herein relate to ranging between peer devices. Embodiments relate to a wireless station that includes one or more antennas, one or more radios, and one or more processors coupled (directly or indirectly) to the radios. At least one radio is configured to perform Wi-Fi communications. The wireless station may perform voice and/or data communications, as well as the methods described herein. In some embodiments, one or more wireless stations operate to configure Neighbor Awareness Networking (NAN)—direct communication with neighboring wireless stations, e.g., without utilizing an intermediate access point. Configuration of NAN includes setup, scheduling, and performance of a NAN ranging procedure. Scheduling of a NAN ranging procedure includes a first wireless device sending first information that includes first scheduling preferences and a first ranging role of the first wireless device to a second wireless device. The first wireless device then receives second information that includes second scheduling preferences and a second ranging role of the second wireless device.
The first information may also include first ranging parameters associated with the first wireless device and the first wireless device may initiate the ranging procedure based on the scheduling preferences and ranging parameters. Alternatively, the second information may also include second ranging parameters associated with the second wireless device and the second wireless device may initiate the ranging procedure based on the scheduling preferences and ranging parameters. This Summary is intended to provide a brief overview of some of the subject matter described in this document. Accordingly, it will be appreciated that the above-described features are merely examples and should not be construed to narrow the scope or spirit of the subject matter described herein in any way. Other features, aspects, and advantages of the subject matter described herein will become apparent from the following Detailed Description, Figures, and Claims.

BRIEF DESCRIPTION OF THE DRAWINGS

A better understanding of the present subject matter can be obtained when the following detailed description of the embodiments is considered in conjunction with the following drawings. FIG. 1 illustrates an example WLAN communication system, according to some embodiments. FIG. 2 illustrates an example simplified block diagram of a WLAN Access Point (AP), according to some embodiments. FIG. 3 illustrates an example simplified block diagram of a wireless station (UE), according to some embodiments. FIG. 4 illustrates an example signaling diagram between an unsolicited publishing NAN device and a passively subscribing NAN device, according to some embodiments. FIG. 5 illustrates an example signaling diagram between a solicited publishing NAN device and an actively subscribing NAN device, according to some embodiments. FIG. 6A illustrates an example block diagram of a method for scheduling a ranging procedure between peer devices, according to some embodiments. FIG.
6B illustrates an example processing element including modules for scheduling a ranging procedure between peer devices, according to some embodiments. FIG. 7 illustrates an example frame schedule for full slot schedule (FSS) scheduling, according to some embodiments. FIGS. 8A-8D illustrate example full slot schedule (FSS) signaling for various values of an as soon as possible (ASAP) parameter, according to some embodiments. FIG. 9 illustrates an example frame schedule for partial slot schedule (PSS) scheduling, according to some embodiments. FIGS. 10A-10D illustrate example partial slot schedule (PSS) signaling for various values of an as soon as possible (ASAP) parameter, according to some embodiments. While the features described herein are susceptible to various modifications and alternative forms, specific embodiments thereof are shown by way of example in the drawings and are herein described in detail. It should be understood, however, that the drawings and detailed description thereto are not intended to be limiting to the particular form disclosed, but on the contrary, the intention is to cover all modifications, equivalents and alternatives falling within the spirit and scope of the subject matter as defined by the appended claims.

DETAILED DESCRIPTION

Acronyms

Various acronyms are used throughout the present application.
Definitions of the most prominently used acronyms that may appear throughout the present application are provided below:

UE: User Equipment
AP: Access Point
DL: Downlink (from BS to UE)
UL: Uplink (from UE to BS)
TX: Transmission/Transmit
RX: Reception/Receive
LAN: Local Area Network
WLAN: Wireless LAN
RAT: Radio Access Technology
DW: Discovery Window
NW: Negotiation Window
FAW: Further Availability Window
SID: Service ID
SInf: Service Information
SInf-Seg: Service Information Segment
NW-Req: to request the peer NAN device to present in NW
CaOp: Capabilities and Operations elements
Security: Security preferences
SessionInfo: advertisement_id, session_mac, session_id, port, proto
ChList: preferred datapath channels
NAN: neighbor awareness networking
LPN: low power NAN device
NSDP: NAN service discovery proxy
TTL: time to live

Terminology

The following is a glossary of terms used in this disclosure:

Memory Medium—Any of various types of non-transitory memory devices or storage devices. The term “memory medium” is intended to include an installation medium, e.g., a CD-ROM, floppy disks, or tape device; a computer system memory or random access memory such as DRAM, DDR RAM, SRAM, EDO RAM, Rambus RAM, etc.; a non-volatile memory such as a Flash, magnetic media, e.g., a hard drive, or optical storage; registers, or other similar types of memory elements, etc. The memory medium may include other types of non-transitory memory as well or combinations thereof. In addition, the memory medium may be located in a first computer system in which the programs are executed, or may be located in a second different computer system which connects to the first computer system over a network, such as the Internet. In the latter instance, the second computer system may provide program instructions to the first computer for execution.
The term “memory medium” may include two or more memory mediums which may reside in different locations, e.g., in different computer systems that are connected over a network. The memory medium may store program instructions (e.g., embodied as computer programs) that may be executed by one or more processors.

Carrier Medium—a memory medium as described above, as well as a physical transmission medium, such as a bus, network, and/or other physical transmission medium that conveys signals such as electrical, electromagnetic, or digital signals.

Computer System—any of various types of computing or processing systems, including a personal computer system (PC), mainframe computer system, workstation, network appliance, Internet appliance, personal digital assistant (PDA), television system, grid computing system, or other device or combinations of devices. In general, the term “computer system” can be broadly defined to encompass any device (or combination of devices) having at least one processor that executes instructions from a memory medium.

Mobile Device (or Mobile Station)—any of various types of computer system devices which are mobile or portable and which perform wireless communications using WLAN communication. Examples of mobile devices include mobile telephones or smart phones (e.g., iPhone™, Android™-based phones), and tablet computers such as iPad™, Samsung Galaxy™, etc. Various other types of devices would fall into this category if they include Wi-Fi or both cellular and Wi-Fi communication capabilities, such as laptop computers (e.g., MacBook™), portable gaming devices (e.g., Nintendo DS™, PlayStation Portable™, Game Boy Advance™, iPhone™), portable Internet devices, and other handheld devices, as well as wearable devices such as smart watches, smart glasses, headphones, pendants, earpieces, etc.
In general, the term “mobile device” can be broadly defined to encompass any electronic, computing, and/or telecommunications device (or combination of devices) which is easily transported by a user and capable of wireless communication using WLAN or Wi-Fi.

Wireless Device (or Wireless Station)—any of various types of computer system devices which perform wireless communications using WLAN communications. As used herein, the term “wireless device” may refer to a mobile device, as defined above, or to a stationary device, such as a stationary wireless client or a wireless base station. For example, a wireless device may be any type of wireless station of an 802.11 system, such as an access point (AP) or a client station (STA or UE). Further examples include televisions, media players (e.g., AppleTV™, Roku™, Amazon FireTV™, Google Chromecast™, etc.), refrigerators, laundry machines, thermostats, and so forth.

WLAN—The term “WLAN” has the full breadth of its ordinary meaning, and at least includes a wireless communication network or RAT that is serviced by WLAN access points and which provides connectivity through these access points to the Internet. Most modern WLANs are based on IEEE 802.11 standards and are marketed under the name “Wi-Fi”. A WLAN network is different from a cellular network.

Processing Element—refers to various implementations of digital circuitry that perform a function in a computer system. Additionally, processing element may refer to various implementations of analog or mixed-signal (combination of analog and digital) circuitry that perform a function (or functions) in a computer or computer system.
Processing elements include, for example, circuits such as an integrated circuit (IC), ASIC (Application Specific Integrated Circuit), portions or circuits of individual processor cores, entire processor cores, individual processors, programmable hardware devices such as a field programmable gate array (FPGA), and/or larger portions of systems that include multiple processors.

Low Power NAN device (LPN)—refers to a NAN device (e.g., a wireless device) that may have limited power and may typically operate in a hibernate (or sleep) mode.

NAN data link (NDL)—refers to a communication link between peer wireless stations (e.g., peer NAN devices). Note that the peer devices may be in a common (e.g., same) NAN cluster. In addition, a NAN data link may support one or more NAN datapaths between peer wireless stations. Note further that a NAN data link may only belong to a single NAN data cluster.

NAN datapath (NDP)—refers to a communication link between peer wireless stations that supports a service. Note that one or more NAN datapaths may be supported by a NAN data link. Additionally, note that a NAN datapath supports a service between wireless stations. Typically, one of the peer wireless stations will be a publisher of the service and the other peer wireless station will be a subscriber to the service.

NAN cluster—refers to multiple peer wireless stations linked via synchronization to a common time source (e.g., a common NAN clock). Note that a peer wireless station may be a member of more than one NAN cluster.

NAN data cluster (NDC)—refers to a set of peer wireless stations in a common (e.g., same) NAN cluster that share a common base schedule (e.g., a NAN data cluster base schedule). In addition, peer wireless stations in a NAN data cluster may share at least one NAN data link with another member wireless station within the NAN data cluster.
Note that a peer wireless station may be a member of more than one NAN cluster; however, as noted previously, a NAN data link belongs to exactly one NAN data cluster. Note further that, in a NAN data cluster, all member peer wireless stations may maintain tight synchronization (e.g., via a NAN data cluster base schedule) amongst each other and may be present at a common (e.g., same) further availability slot(s) (or window(s)) as indicated by a NAN data cluster base schedule. In addition, each NAN data link may have its own NAN data link schedule and the NAN data link schedule may be a superset of a NAN data cluster base schedule.

Automatically—refers to an action or operation performed by a computer system (e.g., software executed by the computer system) or device (e.g., circuitry, programmable hardware elements, ASICs, etc.), without user input directly specifying or performing the action or operation. Thus the term “automatically” is in contrast to an operation being manually performed or specified by the user, where the user provides input to directly perform the operation. An automatic procedure may be initiated by input provided by the user, but the subsequent actions that are performed “automatically” are not specified by the user, e.g., are not performed “manually”, where the user specifies each action to perform. For example, a user filling out an electronic form by selecting each field and providing input specifying information (e.g., by typing information, selecting check boxes, radio selections, etc.) is filling out the form manually, even though the computer system must update the form in response to the user actions. The form may be automatically filled out by the computer system where the computer system (e.g., software executing on the computer system) analyzes the fields of the form and fills in the form without any user input specifying the answers to the fields.
As indicated above, the user may invoke the automatic filling of the form, but is not involved in the actual filling of the form (e.g., the user is not manually specifying answers to fields but rather they are being automatically completed). The present specification provides various examples of operations being automatically performed in response to actions the user has taken.

Concurrent—refers to parallel execution or performance, where tasks, processes, signaling, messaging, or programs are performed in an at least partially overlapping manner. For example, concurrency may be implemented using “strong” or strict parallelism, where tasks are performed (at least partially) in parallel on respective computational elements, or using “weak parallelism”, where the tasks are performed in an interleaved manner, e.g., by time multiplexing of execution threads.

Configured to—Various components may be described as “configured to” perform a task or tasks. In such contexts, “configured to” is a broad recitation generally meaning “having structure that” performs the task or tasks during operation. As such, the component can be configured to perform the task even when the component is not currently performing that task (e.g., a set of electrical conductors may be configured to electrically connect a module to another module, even when the two modules are not connected). In some contexts, “configured to” may be a broad recitation of structure generally meaning “having circuitry that” performs the task or tasks during operation. As such, the component can be configured to perform the task even when the component is not currently on. In general, the circuitry that forms the structure corresponding to “configured to” may include hardware circuits. Various components may be described as performing a task or tasks, for convenience in the description.
Such descriptions should be interpreted as including the phrase “configured to.” Reciting a component that is configured to perform one or more tasks is expressly intended not to invoke 35 U.S.C. § 112(f) interpretation for that component.

FIG. 1—WLAN System

FIG. 1 illustrates an example WLAN system according to some embodiments. As shown, the exemplary WLAN system includes a plurality of wireless client stations or devices, or user equipment (UEs), 106 that are configured to communicate over a wireless communication channel 142 with an Access Point (AP) 112. The AP 112 may be a Wi-Fi access point. The AP 112 may communicate via a wired and/or a wireless communication channel 150 with one or more other electronic devices (not shown) and/or another network 152, such as the Internet. Additional electronic devices, such as the remote device 154, may communicate with components of the WLAN system via the network 152. For example, the remote device 154 may be another wireless client station. The WLAN system may be configured to operate according to any of various communications standards, such as the various IEEE 802.11 standards. In some embodiments, at least one wireless device 106 is configured to communicate directly with one or more neighboring mobile devices, without use of the access point 112. Further, in some embodiments, as further described below, a wireless device 106 may be configured to schedule a NAN ranging procedure, including sending information that includes scheduling preferences and a ranging role to another wireless device (e.g., another wireless device 106 and/or access point 112). Wireless device 106 may then receive information that includes scheduling preferences and a ranging role of the other wireless device. The information exchanged between the wireless devices may include ranging parameters. Wireless device 106 may initiate the ranging procedure based on the exchanged scheduling preferences and ranging parameters.

FIG. 2—Access Point Block Diagram

FIG. 2 illustrates an exemplary block diagram of an access point (AP) 112. It is noted that the block diagram of the AP of FIG. 2 is only one example of a possible system. As shown, the AP 112 may include processor(s) 204 which may execute program instructions for the AP 112. The processor(s) 204 may also be coupled (directly or indirectly) to memory management unit (MMU) 240, which may be configured to receive addresses from the processor(s) 204 and to translate those addresses to locations in memory (e.g., memory 260 and read only memory (ROM) 250) or to other circuits or devices. The AP 112 may include at least one network port 270. The network port 270 may be configured to couple to a wired network and provide a plurality of devices, such as mobile devices 106, access to the Internet. For example, the network port 270 (or an additional network port) may be configured to couple to a local network, such as a home network or an enterprise network. For example, port 270 may be an Ethernet port. The local network may provide connectivity to additional networks, such as the Internet. The AP 112 may include at least one antenna 234, which may be configured to operate as a wireless transceiver and may be further configured to communicate with mobile device 106 via wireless communication circuitry 230. The antenna 234 communicates with the wireless communication circuitry 230 via communication chain 232. Communication chain 232 may include one or more receive chains, one or more transmit chains or both. The wireless communication circuitry 230 may be configured to communicate via Wi-Fi or WLAN, e.g., 802.11.
The wireless communication circuitry 230 may also, or alternatively, be configured to communicate via various other wireless communication technologies, including, but not limited to, Long-Term Evolution (LTE), LTE Advanced (LTE-A), Global System for Mobile (GSM), Wideband Code Division Multiple Access (WCDMA), CDMA2000, etc., for example when the AP is co-located with a base station in case of a small cell, or in other instances when it may be desirable for the AP 112 to communicate via various different wireless communication technologies. Further, in some embodiments, as further described below, AP 112 may be configured to schedule a NAN ranging procedure, including sending information that includes scheduling preferences and a ranging role to a wireless device (e.g., wireless device 106). AP 112 may then receive information that includes scheduling preferences and a ranging role of the wireless device. The information exchanged between the AP and wireless device may include ranging parameters. AP 112 may initiate the ranging procedure based on the exchanged scheduling preferences and ranging parameters.

FIG. 3—Client Station Block Diagram

FIG. 3 illustrates an example simplified block diagram of a client station 106. According to embodiments, client station 106 may be a user equipment (UE) device, a mobile device or mobile station, and/or a wireless device or wireless station. As shown, the client station 106 may include a system on chip (SOC) 300, which may include portions for various purposes. The SOC 300 may be coupled to various other circuits of the client station 106.
For example, the client station 106 may include various types of memory (e.g., including NAND flash 310), a connector interface (I/F) (or dock) 320 (e.g., for coupling to a computer system, dock, charging station, etc.), the display 360, cellular communication circuitry 330 such as for LTE, GSM, etc., and short to medium range wireless communication circuitry 329 (e.g., Bluetooth™ and WLAN circuitry). The client station 106 may further include one or more smart cards 345 that incorporate SIM (Subscriber Identity Module) functionality, such as one or more UICC(s) (Universal Integrated Circuit Card(s)) cards 345. The cellular communication circuitry 330 may couple to one or more antennas, such as antennas 335 and 336 as shown. The short to medium range wireless communication circuitry 329 may also couple to one or more antennas, such as antennas 337 and 338 as shown. Alternatively, the short to medium range wireless communication circuitry 329 may couple to the antennas 335 and 336 in addition to, or instead of, coupling to the antennas 337 and 338. The short to medium range wireless communication circuitry 329 may include multiple receive chains and/or multiple transmit chains for receiving and/or transmitting multiple spatial streams, such as in a multiple-input, multiple-output (MIMO) configuration. As shown, the SOC 300 may include processor(s) 302, which may execute program instructions for the client station 106, and display circuitry 304, which may perform graphics processing and provide display signals to the display 360.
The processor(s) 302 may also be coupled to memory management unit (MMU) 340, which may be configured to receive addresses from the processor(s) 302 and translate those addresses to locations in memory (e.g., memory 306, read only memory (ROM) 350, NAND flash memory 310) and/or to other circuits or devices, such as the display circuitry 304, cellular communication circuitry 330, short range wireless communication circuitry 329, connector interface (I/F) 320, and/or display 360. The MMU 340 may be configured to perform memory protection and page table translation or set up. In some embodiments, the MMU 340 may be included as a portion of the processor(s) 302. As noted above, the client station 106 may be configured to communicate wirelessly directly with one or more neighboring client stations. The client station 106 may be configured to communicate according to a WLAN RAT for communication in a WLAN network, such as that shown in FIG. 1. Further, in some embodiments, as further described below, client station 106 may be configured to schedule a NAN ranging procedure, including sending information that includes scheduling preferences and a ranging role to another wireless device (e.g., another wireless device 106 and/or access point 112). Client station 106 may then receive information that includes scheduling preferences and a ranging role of the other wireless device. The information exchanged between the wireless devices may include ranging parameters. Client station 106 may initiate the ranging procedure based on the exchanged scheduling preferences and ranging parameters. As described herein, the client station 106 may include hardware and software components for implementing the features described herein. For example, the processor 302 of the client station 106 may be configured to implement part or all of the features described herein, e.g., by executing program instructions stored on a memory medium (e.g., a non-transitory computer-readable memory medium). 
Alternatively (or in addition), processor 302 may be configured as a programmable hardware element, such as an FPGA (Field Programmable Gate Array), or as an ASIC (Application Specific Integrated Circuit). Alternatively (or in addition), the processor 302 of the UE 106, in conjunction with one or more of the other components 300, 304, 306, 310, 320, 330, 335, 340, 345, 350, 360, may be configured to implement part or all of the features described herein. In addition, as described herein, processor 302 may include one or more processing elements. Thus, processor 302 may include one or more integrated circuits (ICs) that are configured to perform the functions of processor 302. In addition, each integrated circuit may include circuitry (e.g., first circuitry, second circuitry, etc.) configured to perform the functions of processor(s) 302. Further, as described herein, cellular communication circuitry 330 and short range wireless communication circuitry 329 may each include one or more processing elements. In other words, one or more processing elements may be included in cellular communication circuitry 330 and also in short range wireless communication circuitry 329. Thus, each of cellular communication circuitry 330 and short range wireless communication circuitry 329 may include one or more integrated circuits (ICs) that are configured to perform the functions of cellular communication circuitry 330 and short range wireless communication circuitry 329, respectively. In addition, each integrated circuit may include circuitry (e.g., first circuitry, second circuitry, etc.) configured to perform the functions of cellular communication circuitry 330 and short range wireless communication circuitry 329.

Wi-Fi Peer to Peer Communication Protocol

In some embodiments, Wi-Fi devices (e.g., client station 106) may be able to communicate with each other in a peer to peer manner, e.g., without the communications going through an intervening access point.
There are currently two types of Wi-Fi peer to peer networking protocols in the Wi-Fi Alliance. In one type of peer to peer protocol, when two Wi-Fi devices (e.g., wireless stations) communicate with each other, one of the Wi-Fi devices essentially acts as a pseudo access point and the other acts as a client device. In a second type of Wi-Fi peer to peer protocol, referred to as neighbor awareness networking (NAN), the two Wi-Fi client devices (wireless stations) act as similar peer devices in communicating with each other, e.g., neither one behaves as an access point. In a NAN system, each wireless station may implement methods to ensure that it is synchronized with a neighboring wireless station with which it is communicating. Further, a wireless station may negotiate a common discovery window for exchange of synchronization packets to help ensure the devices that are communicating directly with each other are properly synchronized to enable the communication. Once two wireless stations have the same discovery window they may exchange synchronization packets to stay synchronized with each other. The wireless stations may also use the discovery window to exchange service discovery frames to convey other information such as further availability. The NAN protocol includes two aspects: 1) synchronization and discovery (NAN 1.0) and 2) datapath transmission (NAN 2.0). NAN 1.0 describes methods for NAN protocol synchronization and discovery. One feature described in NAN 2.0 is ranging. In NAN 2.0, two NAN devices may determine a distance between them by leveraging the fine timing measurement (FTM) protocol specified in IEEE 802.11mc. Note that although FTM is primarily intended to support access point to station measurements (AP-STA mode), it may also be used for peer-to-peer measurements (P2P mode). However, the FTM protocol may not scale well in scenarios where there are many STAs attempting to perform ranging (e.g., in a stadium or other large venue).
Additionally, the existing FTM protocol cannot be used “as-is” in the NAN framework. For example, the FTM protocol has its own scheduling distinct from NAN scheduling. Embodiments described herein provide systems and methods for a NAN ranging protocol. The NAN ranging protocol includes NAN ranging capability discovery (RCD), NAN ranging setup (including ranging role, scheduling, and measurements), and NAN ranging termination.

NAN Ranging Capability Discovery

In some embodiments, a NAN device may support, or be capable of performing, ranging. In other words, a NAN device may support ranging capability. In such embodiments, a ranging capable NAN device (“NR-DEV”) may indicate its ranging capability in at least one of a NAN synchronization beacon, a NAN discovery beacon, and/or service discovery frames (SDFs). Note that a ranging capability indication (e.g., as included in one or more SDFs, NAN synchronization beacons, NAN discovery beacons, and/or other messaging) may be provided by including a bit to indicate if the device is ranging capable, identifying a sub-set of ranging capabilities as defined in the FTM protocol, or identifying a full set of ranging capabilities as defined in the FTM protocol. Additionally, in some embodiments, a NR-DEV may only negotiate and/or initiate ranging with peer NR-DEVs with a compatible role. Table 1 shows possible ranging roles of a NR-DEV acting as a subscriber/publisher according to some embodiments. It should be noted that the possible ranging roles shown in Table 1 are exemplary only and other combinations of roles are envisioned.
TABLE 1
NR-DEV Ranging Roles

                                              Subscriber
                                  Passive Subscribe          Active Subscribe
Publisher                         Initiator    Responder     Initiator    Responder
Unsolicited Publish, Initiator    Invalid      Allowed       Invalid      Not Allowed
Unsolicited Publish, Responder    Allowed      Invalid       Not Allowed  Invalid
Solicited Publish, Initiator      Invalid      Not Allowed   Invalid      Allowed
Solicited Publish, Responder      Not Allowed  Invalid       Allowed      Invalid

As shown in Table 1, a NR-DEV may be a ranging initiator or a ranging responder. Note that at a minimum, a NR-DEV will support the role of ranging responder. In addition, a NR-DEV may also support the role of ranging initiator. In other words, the minimum requirement for a device to be a NR-DEV is support of the role of ranging responder. Additionally, as shown in Table 1, either of a NR-DEV publisher or NR-DEV subscriber may assume a role of initiator. For example, an unsolicited publisher may only take the role of ranging initiator if a subscriber is a passive subscriber that can only take the role of the ranging responder. As another example, a solicited publisher may only take the role of ranging responder if a subscriber is an active subscriber that may only take the role of the ranging initiator. Note that if both subscriber and publisher are ranging initiator and ranging responder capable, the service subscriber may decide which ranging role it will assume (take).

NAN Ranging Scheduling

In some embodiments, NAN ranging may be based on the FTM protocol. Thus, a NAN device that intends to perform ranging may schedule a time (or times) and a channel (or channels) to perform FTM procedures. According to NAN scheduling, a NAN schedule of a NAN device is based on an updatedFA (updated further availability window) which may be determined by its currentFA (current further availability window) and preferredFA (preferred further availability window).
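The role-compatibility rules of Table 1 above can be sketched as a lookup table; the function and key names below are invented for illustration and are not part of the NAN specification:

```python
# Illustrative encoding of Table 1 (NR-DEV ranging roles). Keys are
# (publisher mode, publisher role, subscriber mode, subscriber role);
# all names are hypothetical, not from the NAN specification.

TABLE_1 = {
    ("unsolicited", "initiator", "passive", "initiator"): "Invalid",
    ("unsolicited", "initiator", "passive", "responder"): "Allowed",
    ("unsolicited", "initiator", "active", "initiator"): "Invalid",
    ("unsolicited", "initiator", "active", "responder"): "Not Allowed",
    ("unsolicited", "responder", "passive", "initiator"): "Allowed",
    ("unsolicited", "responder", "passive", "responder"): "Invalid",
    ("unsolicited", "responder", "active", "initiator"): "Not Allowed",
    ("unsolicited", "responder", "active", "responder"): "Invalid",
    ("solicited", "initiator", "passive", "initiator"): "Invalid",
    ("solicited", "initiator", "passive", "responder"): "Not Allowed",
    ("solicited", "initiator", "active", "initiator"): "Invalid",
    ("solicited", "initiator", "active", "responder"): "Allowed",
    ("solicited", "responder", "passive", "initiator"): "Not Allowed",
    ("solicited", "responder", "passive", "responder"): "Invalid",
    ("solicited", "responder", "active", "initiator"): "Allowed",
    ("solicited", "responder", "active", "responder"): "Invalid",
}

def role_combination(pub_mode, pub_role, sub_mode, sub_role):
    """Look up whether a publisher/subscriber role pairing is permitted."""
    return TABLE_1[(pub_mode, pub_role, sub_mode, sub_role)]
```

Note the structure the table implies: a pairing is Invalid whenever both peers claim the same role, and among the complementary pairings, unsolicited publishing pairs with passive subscribing while solicited publishing pairs with active subscribing.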
Note that the currentFA may indicate a current further availability map and the preferredFA may indicate other further availability windows preferred by the NAN device in addition to the currentFA. Additionally, the updatedFA may indicate an updated further availability map between peer NAN devices and may serve as the currentFA for future use. In some embodiments, the NAN schedule (indicated by the updatedFA) may be used to determine a ranging schedule. The ranging schedule may be used to derive corresponding FTM parameters such as number of bursts exponent, burst duration, minimum delta FTM, partial timer synchronization function (TSF) timer, ASAP (as soon as possible) parameter, FTMs per burst, FTM format and bandwidth, and burst period, among other FTM parameters. In some embodiments, the ranging role selected by the NR-DEV acting as a subscriber/publisher may depend on requirements of a specific service and may be negotiated prior to performing ranging measurements. The negotiation may be performed as part of SDF exchanges. Exemplary signaling diagrams for various scenarios, e.g., as shown in Table 1 above, are illustrated in FIGS. 4 and 5. In particular, FIG. 4 illustrates an exemplary signaling diagram between an unsolicited publishing NAN device and a passively subscribing NAN device. FIG. 5 illustrates an exemplary signaling diagram between a solicited publishing NAN device and an actively subscribing NAN device. Note that embodiments illustrated in FIGS. 4 and 5 may be used in conjunction with any of the systems or devices shown in the above Figures, among other devices. In various embodiments, some of the signaling shown may be performed concurrently, in a different order than shown, or may be omitted. Additional signaling may also be performed as desired. The procedure (or method, technique) to schedule ranging between NAN devices may proceed as follows.

Turning to FIG. 4, at 410, a first NAN device, such as publisher 406a (e.g., a publishing NAN device or publisher), may send a first service discovery frame (SDF) to a second NAN device, such as subscriber 406b (e.g., to publish a service). Note that publisher 406a and subscriber 406b may include any or all of the features described above in reference to client station 106, among other features. In some embodiments, the first SDF may include parameters such as a service description (SD), current availability (FA), preferred availability (preferredFA), and capabilities such as a ranging role (RR). In some embodiments, if the first NAN device has a ranging initiator role, the first SDF may also include an initial FTM request (or an FTM attribute carried in the first SDF) that includes the first NAN device's preferred FTM parameters. In some embodiments, the initial FTM request may include an as soon as possible (ASAP) parameter. At 420, the second NAN device (e.g., subscriber 406b) may send a second SDF in response to the first SDF. The second SDF may include parameters such as a service description (SD), current availability (FA), preferred availability (preferredFA), and capabilities such as a ranging role (RR). In some embodiments, if the second NAN device has a ranging initiator role, the second SDF may also include an initial FTM request (or an FTM attribute carried in the second SDF) that includes the second NAN device's preferred FTM parameters. Alternatively, if the second NAN device has a ranging responder role, the second SDF may also include an initial FTM response (or an FTM attribute carried in the second SDF). In some embodiments, the initial FTM request may include an as soon as possible (ASAP) parameter. At 430, the first NAN device may send a third SDF to confirm the updatedFA and/or an initial FTM response for NAN ranging if an initial FTM request or FTM attribute was sent in 420.
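The further-availability bookkeeping confirmed at 430 (currentFA and preferredFA negotiated into an updatedFA) might be sketched with per-slot bitmasks. The bitmask representation and the merge rule below are assumptions made for illustration, not the attribute encoding the NAN specification uses:

```python
# Hedged sketch: availability windows as slot bitmasks (an assumption).
# Each device offers its currentFA plus any preferredFA slots; the
# updatedFA is modeled as the slots both peers can offer, and per the
# text above it may serve as the currentFA for future use.

def offered(current_fa: int, preferred_fa: int) -> int:
    """Slots a device can make available: currentFA plus preferredFA."""
    return current_fa | preferred_fa

def negotiate_updated_fa(dev_a, dev_b):
    """updatedFA = slots offered by both peers (illustrative merge rule)."""
    return offered(*dev_a) & offered(*dev_b)

# Device A: currently slots 0-1 (0b0011), also prefers slot 2 (0b0100).
# Device B: currently slots 1-2 (0b0110), also prefers slot 3 (0b1000).
updated = negotiate_updated_fa((0b0011, 0b0100), (0b0110, 0b1000))
# Common offered slots are 1 and 2, i.e., 0b0110.
```

From this merged schedule, FTM parameters such as burst period or FTMs per burst could then be derived per slot, as the scheduling discussion above describes.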
At 440, the initiator may initiate FTM procedures based on whether an initial FTM request was included in 420. Thus, if the initial FTM request was not included in 420, the FTM request may be negotiated using standard FTM procedures (e.g., an initial FTM request (+ack) and FTM_1 response (+ack) as part of 440). In some embodiments, the ranging negotiation may occur in a first available time-slot based on the updatedFA negotiated in 430, as further described below. In such embodiments, the initiator/responder may honor the currentFA (e.g., the updatedFA in 430) in the ranging negotiation or the initiator/responder may send an SDF frame to update the currentFA (e.g., the updatedFA in 430) if the ranging negotiation results are different from the currentFA in terms of a need for higher bandwidth. In some embodiments, the SDF may be sent after the ranging negotiation is completed in the current slot as well as in the next discovery window (DW). In some embodiments, if an initial FTM request was included at 420, the initial FTM request (+ack) and FTM_1 response (+ack) may be skipped in the FTM procedures. In other words, the initiator may start by sending an FTM trigger frame directly in the first available time-slot since the FTM has already been requested (FTM request) and acknowledged (FTM response). Upon conclusion of the FTM procedure, the initiator may send a range value to the responder via SDF. The range value may be sent on a per time-slot basis or periodically (averaging over multiple measurements). Turning to FIG. 5, at 510, a first NAN device, such as publisher 506a (e.g., a publishing NAN device or publisher), may receive a first SDF from a second NAN device, such as subscriber 506b (e.g., to subscribe to a service). Note that publisher 506a and subscriber 506b may include any or all of the features described above in reference to client station 106, among other features.
In some embodiments, the first SDF may include parameters such as a service description (SD), current availability (FA), preferred availability (preferredFA), and capabilities such as a ranging role (RR). In some embodiments, if the second NAN device has a ranging initiator role, the first SDF may also include an initial FTM request (or an FTM attribute carried in the first SDF) that includes the second NAN device's preferred FTM parameters. In some embodiments, the initial FTM request may include an as soon as possible (ASAP) parameter. At 520, the first NAN device may send a second SDF in response to the first SDF. The second SDF may include parameters such as a service description (SD), current availability (FA), preferred availability (preferredFA), and capabilities such as a ranging role (RR). In some embodiments, if the first NAN device has a ranging initiator role, the second SDF may also include an initial FTM request (or an FTM attribute carried in the second SDF) that includes the first NAN device's preferred FTM parameters. Alternatively, if the first NAN device has a ranging responder role, the second SDF may also include an initial FTM response (or an FTM attribute carried in the second SDF). In some embodiments, the initial FTM request may include an as soon as possible (ASAP) parameter. At 530, the second NAN device may send a third SDF to confirm the updatedFA and/or an initial FTM response for NAN ranging if an initial FTM request or FTM attribute was sent in 520. At 540, the initiator may initiate FTM procedures based on whether an initial FTM request was included in 520. Thus, if the initial FTM request was not included in 520, the FTM request may be negotiated using standard FTM procedures (e.g., an initial FTM request (+ack) and FTM_1 response (+ack) as part of 540). In some embodiments, the ranging negotiation may occur in a first available time-slot based on the updatedFA negotiated in 530, as further described below.
In such embodiments, the initiator/responder may honor the currentFA (e.g., the updatedFA in 530) in the ranging negotiation or the initiator/responder may send an SDF frame to update the currentFA (e.g., the updatedFA in 530) if the ranging negotiation results are different from the currentFA in terms of a need for higher bandwidth. In some embodiments, the SDF may be sent after the ranging negotiation is completed in the current slot as well as in the next discovery window (DW). In some embodiments, if an initial FTM request was included at 520, the initial FTM request (+ack) and FTM_1 response (+ack) may be skipped in the FTM procedures. In other words, the initiator may start by sending an FTM trigger frame directly in the first available time-slot since the FTM has already been requested (FTM request) and acknowledged (FTM response). Upon conclusion of the FTM procedure, the initiator may send a range value to the responder via SDF. The range value may be sent on a per time-slot basis or periodically (averaging over multiple measurements). FIG. 6A illustrates a block diagram of a method for scheduling a ranging procedure between peer devices, according to some embodiments. The method shown in FIG. 6A may be used in conjunction with any of the systems or devices shown in the above Figures, among other devices. In various embodiments, some of the method elements shown may be performed concurrently, in a different order than shown, or may be omitted. Additional method elements may also be performed as desired. As shown, this method may operate as follows. At 602, a wireless device may send first information to a neighboring wireless device. The first information may include scheduling preferences of the wireless device. In addition, the first information may include a ranging role associated with the wireless device. 
In some embodiments, the scheduling preferences may include parameters such as current availability and/or preferred availability of the wireless device. In addition, the wireless device may specify ranging capability via the ranging role. In other words, the ranging role may indicate (or specify) whether the wireless device may operate as a ranging initiator and responder, or as a ranging responder only. In some embodiments, if the ranging role indicates the wireless device may be an initiator, the neighboring wireless device may determine that the wireless device may also be a responder. However, if the ranging role indicates that the wireless device may be a responder, the neighboring wireless device may determine that the wireless device may not be an initiator. In some embodiments, the first information may also include a service description (SD). At 604, the wireless device may receive second information from the neighboring wireless device. The second information may include scheduling preferences of the neighboring wireless device. In addition, the second information may include a ranging role associated with the neighboring wireless device. In some embodiments, the scheduling preferences may include parameters such as current availability and/or preferred availability of the neighboring wireless device. In addition, the neighboring wireless device may specify its ranging capability via the ranging role. In other words, the ranging role may indicate (or specify) whether the neighboring wireless device may operate as a ranging initiator and a responder, or as a ranging responder only. In some embodiments, if the ranging role indicates the neighboring wireless device may be an initiator, the wireless device may determine that the neighboring wireless device may also be a responder.
However, if the ranging role indicates that the neighboring wireless device may be a responder, the wireless device may determine that the neighboring wireless device may not be an initiator. In some embodiments, the second information may also include a service description (SD). At 606, the wireless device may perform a ranging procedure with the neighboring wireless device. The ranging procedure may be based on the ranging roles and scheduling preferences exchanged between the wireless devices. In some embodiments, the wireless device may initiate the ranging procedure, e.g., if the wireless device has ranging capabilities including initiator and/or responder roles and the wireless device determines a role as initiator. In some embodiments, the neighboring wireless device may initiate the ranging procedure, e.g., if the wireless device does not have ranging capabilities including initiator role and/or if the neighboring wireless device has ranging capabilities including initiator and/or responder and determines a role as initiator. FIG. 6B illustrates a processing element including modules for scheduling a ranging procedure between peer devices, according to some embodiments. In some embodiments, antenna 635 may be coupled to processing element 664. The processing element may be configured to perform the method described above in reference to FIG. 6A. In some embodiments, processing element 664 may include one or more modules, such as modules (or circuitry) 622-626, and the modules (or circuitry) may be configured to perform various operations of the method described above in reference to FIG. 6A. In some embodiments, the processing element may be included in a device such as client station 106. In other embodiments, the processing element may be included in a baseband processor or radio of a device such as client station 106. As shown, the modules may be configured as follows. 
In some embodiments, processing element 664 may include a send module 622 configured to send first information to a neighboring wireless device. The first information may include scheduling preferences of the wireless device. In addition, the first information may include a ranging role associated with the wireless device. In some embodiments, the scheduling preferences may include parameters such as current availability and/or preferred availability of the wireless device. In addition, the wireless device may specify ranging capability via the ranging role. In other words, the ranging role may indicate (or specify) whether the wireless device may be a ranging initiator and responder, or a ranging responder only. In some embodiments, if the ranging role indicates the wireless device may be an initiator, the neighboring wireless device may determine that the wireless device may also be a responder. However, if the ranging role indicates that the wireless device may be a responder, the neighboring wireless device may determine that the wireless device may not be an initiator. In some embodiments, the first information may also include a service description (SD). In some embodiments, processing element 664 may include a receive module 624 configured to receive second information from the neighboring wireless device. The second information may include scheduling preferences of the neighboring wireless device. In addition, the second information may include a ranging role associated with the neighboring wireless device. In some embodiments, the scheduling preferences may include parameters such as current availability and/or preferred availability of the neighboring wireless device. In addition, the neighboring wireless device may specify ranging capability via the ranging role. In other words, the ranging role may indicate (or specify) whether the neighboring wireless device may be a ranging initiator and responder, or a ranging responder only.
In some embodiments, if the ranging role indicates the neighboring wireless device may be an initiator, the wireless device may determine that the neighboring wireless device may also be a responder. However, if the ranging role indicates that the neighboring wireless device may be a responder, the wireless device may determine that the neighboring wireless device may not be an initiator. In some embodiments, the second information may also include a service description (SD). In some embodiments, processing element 664 may include a perform module 626 configured to perform a ranging procedure with the neighboring wireless device. The ranging procedure may be based on the ranging roles and scheduling preferences exchanged between the wireless devices. In some embodiments, the wireless device may initiate the ranging procedure, e.g., if the wireless device has ranging capabilities including initiator and/or responder roles and the wireless device determines a role as initiator. In some embodiments, the neighboring wireless device may initiate the ranging procedure, e.g., if the wireless device does not have ranging capabilities including initiator role and/or if the neighboring wireless device has ranging capabilities including initiator and/or responder and determines a role as initiator. It is apparent to those skilled in the art that, for the particular processes of the modules (or circuitry) described above (such as modules 622, 624, and 626), reference may be made to the corresponding operations (such as operations 602, 604, and 606, respectively) in the related process embodiment sharing the same concept, and the reference is regarded as the disclosure of the related modules (or circuitry) as well. Furthermore, processing element 664 may be implemented in software, hardware, or a combination thereof. 
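The ranging-role rule described above (a device advertising an initiator role is taken to be responder-capable as well, while a responder-only role implies the device may not initiate) can be sketched as follows. The enum, function names, and dictionary shape are illustrative, not part of the disclosure:

```python
from enum import Enum

class RangingRole(Enum):
    INITIATOR = "initiator"  # device can initiate (and, implicitly, respond)
    RESPONDER = "responder"  # device can only respond

def peer_capabilities(advertised_role: RangingRole) -> dict:
    """Infer a peer's ranging capabilities from its advertised role.

    Per the rule above: a peer advertising INITIATOR is assumed to also be
    capable of acting as a responder; a peer advertising RESPONDER only is
    assumed not to be capable of initiating.
    """
    if advertised_role is RangingRole.INITIATOR:
        return {"initiator": True, "responder": True}
    return {"initiator": False, "responder": True}

def select_initiator(local_role: RangingRole, peer_role: RangingRole):
    """Decide which side should initiate the ranging procedure, if any."""
    if local_role is RangingRole.INITIATOR:
        return "local"
    if peer_role is RangingRole.INITIATOR:
        return "peer"
    return None  # neither side can initiate; no ranging procedure is possible
```

Under this reading, a ranging procedure can proceed only when at least one of the two devices advertises the initiator role.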
More specifically, processing element 664 may be implemented as circuits such as an ASIC (Application Specific Integrated Circuit), portions or circuits of individual processor cores, entire processor cores, individual processors, programmable hardware devices such as a field programmable gate array (FPGA), and/or larger portions of systems that include multiple processors. Additionally, processing element 664 may be implemented as a general-purpose processor such as a CPU, and therefore each module can be implemented with the CPU executing instructions stored in a memory that perform a respective operation. NAN Ranging Setup In some embodiments, a NAN ranging schedule may span multiple time-slots and each time-slot may be on a different channel with different bandwidth availabilities. Thus, a ranging operation can be scheduled as a full slot schedule (FSS) or a per slot schedule (PSS). In FSS mode, time-slots used and associated FTM parameters in the time-slots may be pre-specified for the entire ranging schedule by a responder. One benefit of FSS mode can be that there is no need for re-negotiation of the FTM parameters in each time-slot. In other words, one set of FTM parameters may be used over the available time-slots (e.g., number of FTMs per burst, burst period, bandwidth, etc.). In some embodiments, this option may be limited to a time-slot with the most resource constraints. Alternatively, in some embodiments, a single set of FTM parameters per time-slot may be used to allow for more scheduling flexibility and to provide higher accuracy. However, such embodiments may require more overhead due to increased complexity. In some embodiments, FSS may be used to track a NR-DEV by scheduling measurement bursts periodically. In some embodiments, there may be multiple RTT (round trip time) measurements within each burst. In some embodiments, NAN devices (or applications) may set ranging to be performed at a pre-determined schedule without negotiation (e.g. 
every 512 TU with 2 RTT measurements) or ranging may be performed only at predefined communications windows, such as discovery windows, discovery window extensions (time window right after discovery windows), or paging windows (rendezvous windows for active NAN data paths). FIG. 7 illustrates a frame schedule for full slot schedule (FSS) scheduling, according to some embodiments. As illustrated, a frame may include a discovery window 730 on a 2.4 GHz channel (e.g., channel 6) and a discovery window 735 on a 5 GHz channel (e.g., channel 149). In addition, each discovery window may be followed by one or more negotiation windows (e.g., negotiation windows 740 and 745). Further, a further availability window may be scheduled on the 2.4 GHz channel (e.g., negotiation window 750) and the 5 GHz channel (e.g., negotiation window 755) as well as one or more additional channels (e.g., channels x and/or y and negotiation windows 757 and 759). A NAN ranging procedure as described above and further discussed below may be scheduled during at least a portion of each further availability window (e.g., ranging procedures 760, 765, 767, and 769), as illustrated. FIGS. 8A-8D illustrate full slot schedule (FSS) signaling for various values of an as soon as possible (ASAP) parameter, according to some embodiments. FIGS. 8A and 8B illustrate FSS scheduling and timing for an FSS with an initial FTM request including an as soon as possible (ASAP) parameter value of 0, according to some embodiments. FIGS. 8C and 8D illustrate FSS scheduling and timing for an FSS with an initial FTM request included in signaling step two for an ASAP parameter value of 1. Turning to FIG. 8A, the signaling shown in FIG. 8A may be used in conjunction with any of the systems, methods, or devices shown in the above Figures, among other systems, devices, and methods. In various embodiments, some of the signaling shown may be performed concurrently, in a different order than shown, or may be omitted. 
Additional signaling may also be performed as desired. As shown, the signaling may operate as follows. At 802, signaling may be exchanged between initiator (e.g., a client station such as client station 106 described above) and responder (e.g., another client station, such as client station 106) to negotiate the parameters for the ranging. Thus, the initiator may send an FTM request (FTM req) to the responder and may receive an acknowledgment from the responder. In addition, the responder may send ranging parameters and/or a first response (FTM_0) and the initiator may acknowledge the response. At 804, additional signaling may be exchanged between the initiator and responder to determine a first round trip time (RTT). Thus, the initiator may send an FTM request (FTM req) to the responder and may receive an acknowledgment from the responder. In addition, the responder may send a second and third response (FTM_1 and FTM_2) and the initiator may acknowledge the responses. In addition, the initiator may use timing parameters associated with the second and third responses (e.g., time sent, time received) to calculate the first RTT. At 806, additional signaling may be exchanged between the initiator and responder to determine second and third RTTs. Thus, the initiator may send an FTM request (FTM req) to the responder and may receive an acknowledgment from the responder. In addition, the responder may send a fourth and fifth response (FTM_3 and FTM_4) and the initiator may acknowledge the responses. In addition, the initiator may use timing parameters associated with the third, fourth, and fifth responses (e.g., time sent, time received) to calculate the second and third RTTs. Note additionally that the signaling shown in FIG. 8A may be performed in conjunction with the signaling described in reference to FIGS. 4 and 5. For example, operations 802-806 may be included at 440a-n (or 540a-n). 
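The RTT calculation described at 804 and 806 can be illustrated with a small sketch. It assumes the standard FTM timestamp relation RTT = (t4 − t1) − (t3 − t2), where t1 and t4 are captured at the responder (measurement frame sent, ACK received) and t2 and t3 at the initiator (frame received, ACK sent); the function names and timestamp values below are hypothetical:

```python
SPEED_OF_LIGHT_M_PER_S = 299_792_458

def ftm_rtt_ps(t1_ps: int, t2_ps: int, t3_ps: int, t4_ps: int) -> int:
    """Round-trip time (picoseconds) for one FTM exchange.

    t1: responder sends the FTM frame     t2: initiator receives it
    t3: initiator sends its ACK           t4: responder receives the ACK
    Subtracting the initiator's turnaround (t3 - t2) removes the time the
    exchange spent waiting at the initiator, leaving only time in flight.
    """
    return (t4_ps - t1_ps) - (t3_ps - t2_ps)

def distance_m(rtt_ps: int) -> float:
    """One-way distance estimate: the signal covers the range twice per RTT."""
    return (rtt_ps * 1e-12) * SPEED_OF_LIGHT_M_PER_S / 2

# Hypothetical timestamps: ~33.4 ns one-way flight, 10 us initiator turnaround
rtt = ftm_rtt_ps(t1_ps=0, t2_ps=33_356, t3_ps=10_033_356, t4_ps=10_066_712)
estimate = distance_m(rtt)  # roughly 10 m
```

Averaging several such per-burst RTTs (as in the multiple measurements at 804 and 806) would reduce the impact of clock jitter on the final range estimate.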
Alternatively, operation 802 may be included at 420 (or 520) and thus may not be required as part of 440a-n (or 540a-n) (e.g., if negotiation regarding the ranging occurs at 420 (or 520), then the negotiation of operation 802 may be skipped (or omitted) and the signaling may proceed with 804 and 806). Turning to FIG. 8B, the signaling shown in FIG. 8B may be used in conjunction with any of the systems, methods, or devices shown in the above Figures, among other systems, devices, and methods. In various embodiments, some of the signaling shown may be performed concurrently, in a different order than shown, or may be omitted. Additional signaling may also be performed as desired. As shown, the signaling may operate as follows. At 812, signaling may be exchanged between initiator (e.g., a client station such as client station 106 described above) and responder (e.g., another client station, such as client station 106) to negotiate the parameters for the ranging. Thus, the initiator may send an FTM request (FTM req) to the responder and may receive an acknowledgment from the responder. In addition, the responder may send ranging parameters and/or a first response (FTM_0) and the initiator may acknowledge the response. Further, additional signaling may be exchanged between the initiator and responder to determine a first round trip time (RTT). Thus, the initiator may send another FTM request (FTM req) to the responder and may receive another acknowledgment from the responder. In addition, the responder may send a second and third response (FTM_1 and FTM_2) and the initiator may acknowledge the responses. In addition, the initiator may use timing parameters associated with the second and third responses (e.g., time sent, time received) to calculate the first RTT. At 814, additional signaling may be exchanged between the initiator and responder to determine second and third RTTs. 
Thus, the initiator may send an FTM request (FTM req) to the responder and may receive an acknowledgment from the responder. In addition, the responder may send a fourth and fifth response (FTM_3 and FTM_4) and the initiator may acknowledge the responses. In addition, the initiator may use timing parameters associated with the third, fourth, and fifth responses (e.g., time sent, time received) to calculate the second and third RTTs. Note additionally that the signaling shown in FIG. 8B may be performed in conjunction with the signaling described in reference to FIGS. 4 and 5. For example, operations 812-814 may be included at 440a-n (or 540a-n). Turning to FIG. 8C, the signaling shown in FIG. 8C may be used in conjunction with any of the systems, methods, or devices shown in the above Figures, among other systems, devices, and methods. In various embodiments, some of the signaling shown may be performed concurrently, in a different order than shown, or may be omitted. Additional signaling may also be performed as desired. As shown, the signaling may operate as follows. At 832, signaling may be exchanged between initiator (e.g., a client station such as client station 106 described above) and responder (e.g., another client station, such as client station 106) to negotiate the parameters for the ranging. Thus, the initiator may send an FTM request (FTM req) to the responder and may receive an acknowledgment from the responder. At 834, additional signaling may be exchanged between the initiator and responder to determine a first round trip time (RTT). Thus, the responder may send a first and second response (FTM_1 and FTM_2) and the initiator may acknowledge the responses. In addition, the initiator may use timing parameters associated with the first and second responses (e.g., time sent, time received) to calculate the first RTT. At 836, additional signaling may be exchanged between the initiator and responder to determine second and third RTTs. 
Thus, the initiator may send an FTM request (FTM req) to the responder and may receive an acknowledgment from the responder. In addition, the responder may send a third and fourth response (FTM_3 and FTM_4) and the initiator may acknowledge the responses. In addition, the initiator may use timing parameters associated with the second, third, and fourth responses (e.g., time sent, time received) to calculate the second and third RTTs. Note additionally that the signaling shown in FIG. 8C may be performed in conjunction with the signaling described in reference to FIGS. 4 and 5. For example, operations 832-836 may be included at 440a-n (or 540a-n). Alternatively, operation 832 may be included at 420 (or 520) and thus may not be required as part of 440a-n (or 540a-n) (e.g., if negotiation regarding the ranging occurs at 420 (or 520), then the negotiation of operation 832 may be skipped (or omitted) and the signaling may proceed with 834 and 836). Turning to FIG. 8D, the signaling shown in FIG. 8D may be used in conjunction with any of the systems, methods, or devices shown in the above Figures, among other systems, devices, and methods. In various embodiments, some of the signaling shown may be performed concurrently, in a different order than shown, or may be omitted. Additional signaling may also be performed as desired. As shown, the signaling may operate as follows. At 842, signaling may be exchanged between initiator (e.g., a client station such as client station 106 described above) and responder (e.g., another client station, such as client station 106) to negotiate the parameters for the ranging. Thus, the initiator may send an FTM request (FTM req) to the responder and may receive an acknowledgment from the responder. In addition, the responder may send a first and second response (FTM_1 and FTM_2) and the initiator may acknowledge the responses. 
In addition, the initiator may use timing parameters associated with the first and second responses (e.g., time sent, time received) to calculate the first RTT. At 844, additional signaling may be exchanged between the initiator and responder to determine second and third RTTs. Thus, the initiator may send an FTM request (FTM req) to the responder and may receive an acknowledgment from the responder. In addition, the responder may send a third and fourth response (FTM_3 and FTM_4) and the initiator may acknowledge the responses. In addition, the initiator may use timing parameters associated with the second, third, and fourth responses (e.g., time sent, time received) to calculate the second and third RTTs. Note additionally that the signaling shown in FIG. 8D may be performed in conjunction with the signaling described in reference to FIGS. 4 and 5. For example, operations 842-844 may be included at operations 440a-n (or 540a-n). In PSS mode, the first available time-slot and its associated FTM parameters for ranging may be specified by a responder based on the updated FA map, and FTM parameters for subsequent time-slots may be negotiated separately on a slot-by-slot basis. In some embodiments, the negotiation may be performed during a DW and/or NW using an SDF or in a specific time-slot using an initial FTM Request and initial FTM Response. FIG. 9 illustrates a frame schedule for partial slot schedule (PSS) scheduling, according to some embodiments. As illustrated, a frame may include a discovery window 930 on a 2.4 GHz channel (e.g., channel 6) and a discovery window 935 on a 5 GHz channel (e.g., channel 149). In addition, each discovery window may be followed by one or more negotiation windows (e.g., negotiation windows 940 and 945). 
Further, a further availability window may be scheduled on the 2.4 GHz channel (e.g., negotiation window 950) and the 5 GHz channel (e.g., negotiation window 955) as well as one or more additional channels (e.g., channels x and y and negotiation windows 957 and 959). A NAN ranging procedure as described above and further discussed below may be scheduled during at least a portion of the further availability window on the 2.4 GHz channel (e.g., ranging procedure 960) and at least one of the further availability windows on one of the one or more additional channels (e.g., channel x or y) (e.g., ranging procedure 969), as illustrated. FIGS. 10A-10D illustrate partial slot schedule (PSS) signaling for various values of an as soon as possible (ASAP) parameter, according to some embodiments. FIGS. 10A and 10B illustrate PSS scheduling and timing for a PSS with FTM scheduling for ASAP parameter value of 0. FIGS. 10C and 10D illustrate PSS scheduling and timing for a PSS with FTM scheduling for ASAP parameter value of 1. Turning to FIG. 10A, the signaling shown in FIG. 10A may be used in conjunction with any of the systems, methods, or devices shown in the above Figures, among other devices and methods. In various embodiments, some of the signaling shown may be performed concurrently, in a different order than shown, or may be omitted. Additional signaling may also be performed as desired. As shown, the signaling may operate as follows. At 1002, signaling may be exchanged between initiator (e.g., a client station such as client station 106 described above) and responder (e.g., another client station, such as client station 106) to negotiate the parameters for the ranging. Thus, the initiator may send an FTM request (FTM req) to the responder and may receive an acknowledgment from the responder. In addition, the responder may send ranging parameters and/or a first response (FTM_0) and the initiator may acknowledge the response. 
At 1004, additional signaling may be exchanged between the initiator and responder to determine a round trip time (RTT). Thus, the initiator may send an FTM request (FTM req) to the responder and may receive an acknowledgment from the responder. In addition, the responder may send a second and third response (FTM_1 and FTM_2) and the initiator may acknowledge the responses. In addition, the initiator may use timing parameters associated with the second and third responses (e.g., time sent, time received) to calculate the RTT. Note additionally that the signaling shown in FIG. 10A may be performed in conjunction with the signaling described in reference to FIGS. 4 and 5. For example, operations 1002-1004 may be included at operations 440a-n (or 540a-n). Alternatively, 1002 may be included at operation 420 (or 520) and thus may not be required as part of operations 440a-n (or 540a-n) (e.g., if negotiation regarding the ranging occurs at 420 (or 520), then the negotiation of 1002 may be skipped (or omitted) and the signaling may proceed with 1004). Turning to FIG. 10B, the signaling shown in FIG. 10B may be used in conjunction with any of the systems, methods, or devices shown in the above Figures, among other devices and methods. In various embodiments, some of the signaling shown may be performed concurrently, in a different order than shown, or may be omitted. Additional signaling may also be performed as desired. As shown, the signaling may operate as follows. At 1012, signaling may be exchanged between initiator (e.g., a client station such as client station 106 described above) and responder (e.g., another client station, such as client station 106) to negotiate the parameters for the ranging. Thus, the initiator may send an FTM request (FTM req) to the responder and may receive an acknowledgment from the responder. In addition, the responder may send ranging parameters and/or a first response (FTM_0) and the initiator may acknowledge the response. 
Further, additional signaling may be exchanged between the initiator and responder to determine a round trip time (RTT). Thus, the initiator may send another FTM request (FTM req) to the responder and may receive another acknowledgment from the responder. In addition, the responder may send a second and third response (FTM_1 and FTM_2) and the initiator may acknowledge the responses. In addition, the initiator may use timing parameters associated with the second and third responses (e.g., time sent, time received) to calculate the RTT. Note additionally that the signaling shown in FIG. 10B may be performed in conjunction with the signaling described in reference to FIGS. 4 and 5. For example, 1012 may be included at operations 440a-n (or 540a-n). Turning to FIG. 10C, the signaling shown in FIG. 10C may be used in conjunction with any of the systems, methods, or devices shown in the above Figures, among other devices and methods. In various embodiments, some of the signaling shown may be performed concurrently, in a different order than shown, or may be omitted. Additional signaling may also be performed as desired. As shown, the signaling may operate as follows. At 1032, signaling may be exchanged between initiator (e.g., a client station such as client station 106 described above) and responder (e.g., another client station, such as client station 106) to negotiate the parameters for the ranging. Thus, the initiator may send an FTM request (FTM req) to the responder and may receive an acknowledgment from the responder. At 1034, additional signaling may be exchanged between the initiator and responder to determine a round trip time (RTT). Thus, the responder may send a first and second response (FTM_1 and FTM_2) and the initiator may acknowledge the responses. In addition, the initiator may use timing parameters associated with the first and second responses (e.g., time sent, time received) to calculate the RTT. Note additionally that the signaling shown in FIG. 
10C may be performed in conjunction with the signaling described in reference to FIGS. 4 and 5. For example, operations 1032-1034 may be included at operations 440a-n (or 540a-n). Alternatively, 1032 may be included at 420 (or 520) and thus may not be required as part of operations 440a-n (or 540a-n) (e.g., if negotiation regarding the ranging occurs at 420 (or 520), then the negotiation of 1032 may be skipped (or omitted) and the signaling may proceed with 1034). Turning to FIG. 10D, the signaling shown in FIG. 10D may be used in conjunction with any of the systems, methods, or devices shown in the above Figures, among other devices and methods. In various embodiments, some of the signaling shown may be performed concurrently, in a different order than shown, or may be omitted. Additional signaling may also be performed as desired. As shown, the signaling may operate as follows. At 1042, signaling may be exchanged between initiator (e.g., a client station such as client station 106 described above) and responder (e.g., another client station, such as client station 106) to negotiate the parameters for the ranging. Thus, the initiator may send an FTM request (FTM req) to the responder and may receive an acknowledgment from the responder. In addition, the responder may send a first and second response (FTM_1 and FTM_2) and the initiator may acknowledge the responses. In addition, the initiator may use timing parameters associated with the first and second responses (e.g., time sent, time received) to calculate the RTT. At 1044, additional signaling may be exchanged between the initiator and responder to determine second and third RTTs. Thus, the initiator may send an FTM request (FTM req) to the responder and may receive an acknowledgment from the responder. In addition, the responder may send a third and fourth response (FTM_3 and FTM_4) and the initiator may acknowledge the responses. 
In addition, the initiator may use timing parameters associated with the second, third, and fourth responses (e.g., time sent, time received) to calculate the second and third RTTs. Note additionally that the signaling shown in FIG. 10D may be performed in conjunction with the signaling described in reference to FIGS. 4 and 5. For example, operations 1042-1044 may be included at operations 440a-n (or 540a-n). Ranging Termination In some embodiments, ranging may be terminated at any time by the initiator or responder per FTM procedures. In some embodiments, ranging may also be terminated by sending an SDF frame in any time-slot with an updated FA. Embodiments of the present disclosure may be realized in any of various forms. For example, some embodiments may be realized as a computer-implemented method, a computer-readable memory medium, or a computer system. Other embodiments may be realized using one or more custom-designed hardware devices such as ASICs. Other embodiments may be realized using one or more programmable hardware elements such as FPGAs. In some embodiments, a non-transitory computer-readable memory medium may be configured so that it stores program instructions and/or data, where the program instructions, if executed by a computer system, cause the computer system to perform a method, e.g., any of the method embodiments described herein, or, any combination of the method embodiments described herein, or, any subset of any of the method embodiments described herein, or, any combination of such subsets. 
In some embodiments, a wireless device may be configured to include a processor (or a set of processors) and a memory medium, where the memory medium stores program instructions, where the processor is configured to read and execute the program instructions from the memory medium, where the program instructions are executable to cause the wireless device to implement any of the various method embodiments described herein (or, any combination of the method embodiments described herein, or, any subset of any of the method embodiments described herein, or, any combination of such subsets). The device may be realized in any of various forms. Although the embodiments above have been described in considerable detail, numerous variations and modifications will become apparent to those skilled in the art once the above disclosure is fully appreciated. It is intended that the following claims be interpreted to embrace all such variations and modifications. 16433198 apple inc. USA B2 Utility Patent Grant (with pre-grant publication) issued on or after January 2, 2001. Open Apr 27th, 2022 08:45AM Apr 27th, 2022 08:45AM Technology Technology Hardware & Equipment Information Technology
nasdaq:aapl Apple Apr 26th, 2022 12:00AM Aug 23rd, 2021 12:00AM https://www.uspto.gov?id=USD0949824-20220426 Audio listening system D949824 The ornamental design for an audio listening system, as shown and described. 1 FIG. 1 is a front perspective view of an audio listening system showing the claimed design; FIG. 2 is a rear view thereof; FIG. 3 is a front view thereof; FIG. 4 is a left side perspective view thereof; FIG. 5 is a left side view thereof; and, FIG. 6 is a right side view thereof. The broken lines in the figures show portions of the audio listening system that form no part of the claimed design. 29804838 apple inc. USA S1 Design Patent Open D14/205 15 Apr 27th, 2022 08:45AM Apr 27th, 2022 08:45AM Technology Technology Hardware & Equipment Information Technology
nasdaq:aapl Apple Apr 26th, 2022 12:00AM Mar 3rd, 2021 12:00AM https://www.uspto.gov?id=US11316968-20220426 Displaying relevant user interface objects Techniques for displaying relevant user interface objects when a device is placed into viewing position are disclosed. The device can update its display in response to a user approaching a vehicle. Display updates can be based on an arrangement of user interface information for unlocking the vehicle. 11316968 1. An electronic device comprising: a display; one or more processors; and memory storing one or more programs configured to be executed by the one or more processors, the one or more programs including instructions for: while a user is using the electronic device, detecting that the user is approaching a vehicle associated with the user; in response to detecting that the user is approaching the vehicle, displaying on the display, a first icon for unlocking the vehicle; and in response to unlocking the vehicle via the first icon, displaying on the display of the electronic device, a second icon that corresponds to an action that the user has repeatedly taken while in the vehicle at previous times when the user has been using the vehicle. 2. The electronic device of claim 1, the one or more programs further including instructions for: while displaying the first icon, detecting an input; and in response to detecting the input, initiating a process for unlocking the vehicle. 3. The electronic device of claim 2, wherein the input is directed to the first icon. 4. The electronic device of claim 2, wherein initiating the process for unlocking the vehicle comprises unlocking the vehicle. 5. The electronic device of claim 2, wherein initiating the process for unlocking the vehicle comprises launching an application for performing the process for unlocking the vehicle. 6. 
The electronic device of claim 1, the one or more programs further including instructions for: prior to displaying the first icon, displaying a plurality of icons on the display of the electronic device, wherein the plurality of icons does not include the first icon; and wherein displaying the first icon includes updating the displayed plurality of icons to include the first icon. 7. The electronic device of claim 6, wherein the plurality of icons is selected from a larger plurality of icons available for display in response to a determination that the plurality of icons have a respective relevance. 8. The electronic device of claim 7, wherein the larger plurality of icons available for display includes the first icon. 9. The electronic device of claim 6, wherein the displayed plurality of icons is updated to include the first icon in response to determining an increased relevance of an application for unlocking the vehicle. 10. The electronic device of claim 6, wherein updating the displayed plurality of icons to include the first icon comprises replacing display of a respective one of the plurality of icons with the first icon. 11. The electronic device of claim 1, the one or more programs further including instructions for: determining a relevance of the first icon using a location of the electronic device and a location of the vehicle. 12. The electronic device of claim 11, wherein the relevance of the first icon increases as a distance between the location of the electronic device and the location of the vehicle decreases. 13. The electronic device of claim 11, wherein the relevance of the first icon is determined using an identification of the vehicle. 14. The electronic device of claim 11, wherein the relevance of the first icon is determined using a relevance algorithm. 15. 
The electronic device of claim 1, the one or more programs further including instructions for: determining an increased relevance of the first icon in response to detecting that the user is approaching the vehicle. 16. The electronic device of claim 15, the one or more programs further including instructions for: increasing a displayed size of the first icon in response to determining an increased relevance of the first icon. 17. The electronic device of claim 1, wherein the first icon is associated with a first application for unlocking the vehicle, and the second icon is associated with a second application different from the first application. 18. The electronic device of claim 1, wherein the electronic device is a wearable device. 19. The electronic device of claim 1, wherein the first icon is displayed in response to detecting an input from a movement sensor based on a movement of the electronic device. 20. The electronic device of claim 1, wherein the second icon corresponds to an application that the user has repeatedly launched while in the vehicle at previous times when the user has been using the vehicle. 21. A method, comprising: at an electronic device including a display: while a user is using the electronic device, detecting that the user is approaching a vehicle associated with the user; in response to detecting that the user is approaching the vehicle, displaying on the display, a first icon for unlocking the vehicle; and in response to unlocking the vehicle via the first icon, displaying on the display of the electronic device, a second icon that corresponds to an action that the user has repeatedly taken while in the vehicle at previous times when the user has been using the vehicle. 22. 
A non-transitory computer-readable storage medium storing one or more programs configured to be executed by one or more processors of an electronic device with a display, the one or more programs including instructions for: while a user is using the electronic device, detecting that the user is approaching a vehicle associated with the user; in response to detecting that the user is approaching the vehicle, displaying on the display, a first icon for unlocking the vehicle; and in response to unlocking the vehicle via the first icon, displaying on the display of the electronic device, a second icon that corresponds to an action that the user has repeatedly taken while in the vehicle at previous times when the user has been using the vehicle. 23. The non-transitory computer-readable storage medium of claim 22, the one or more programs further including instructions for: while displaying the first icon, detecting an input; and in response to detecting the input, initiating a process for unlocking the vehicle. 24. The non-transitory computer-readable storage medium of claim 23, wherein the input is directed to the first icon. 25. The non-transitory computer-readable storage medium of claim 23, wherein initiating the process for unlocking the vehicle comprises unlocking the vehicle. 26. The non-transitory computer-readable storage medium of claim 23, wherein initiating the process for unlocking the vehicle comprises launching an application for performing the process for unlocking the vehicle. 27. The non-transitory computer-readable storage medium of claim 22, the one or more programs further including instructions for: prior to displaying the first icon, displaying a plurality of icons on the display of the electronic device, wherein the plurality of icons does not include the first icon; and wherein displaying the first icon includes updating the displayed plurality of icons to include the first icon. 28. 
The non-transitory computer-readable storage medium of claim 22, the one or more programs further including instructions for: determining a relevance of the first icon using a location of the electronic device and a location of the vehicle. 29. The non-transitory computer-readable storage medium of claim 22, the one or more programs further including instructions for: determining an increased relevance of the first icon in response to detecting that the user is approaching the vehicle. 30. The non-transitory computer-readable storage medium of claim 29, the one or more programs further including instructions for: increasing a displayed size of the first icon in response to determining an increased relevance of the first icon. 31. The non-transitory computer-readable storage medium of claim 22, wherein the first icon is associated with a first application for unlocking the vehicle, and the second icon is associated with a second application different from the first application. 32. The non-transitory computer-readable storage medium of claim 22, wherein the electronic device is a wearable device. 33. The non-transitory computer-readable storage medium of claim 22, wherein the first icon is displayed in response to detecting an input from a movement sensor based on a movement of the electronic device. 34. The non-transitory computer-readable storage medium of claim 22, wherein the second icon corresponds to an application that the user has repeatedly launched while in the vehicle at previous times when the user has been using the vehicle. 35. The method of claim 21, further comprising: while displaying the first icon, detecting an input; and in response to detecting the input, initiating a process for unlocking the vehicle. 36. The method of claim 35, wherein the input is directed to the first icon. 37. The method of claim 35, wherein initiating the process for unlocking the vehicle comprises unlocking the vehicle. 38. 
The method of claim 35, wherein initiating the process for unlocking the vehicle comprises launching an application for performing the process for unlocking the vehicle. 39. The method of claim 21, further comprising: prior to displaying the first icon, displaying a plurality of icons on the display of the electronic device, wherein the plurality of icons does not include the first icon; and wherein displaying the first icon includes updating the displayed plurality of icons to include the first icon. 40. The method of claim 21, further comprising: determining a relevance of the first icon using a location of the electronic device and a location of the vehicle. 41. The method of claim 21, further comprising: determining an increased relevance of the first icon in response to detecting that the user is approaching the vehicle. 42. The method of claim 41, further comprising: increasing a displayed size of the first icon in response to determining an increased relevance of the first icon. 43. The method of claim 21, wherein the first icon is associated with a first application for unlocking the vehicle, and the second icon is associated with a second application different from the first application. 44. The method of claim 21, wherein the electronic device is a wearable device. 45. The method of claim 21, wherein the first icon is displayed in response to detecting an input from a movement sensor based on a movement of the electronic device. 46. The method of claim 21, wherein the second icon corresponds to an application that the user has repeatedly launched while in the vehicle at previous times when the user has been using the vehicle. CROSS-REFERENCE TO RELATED APPLICATIONS This application is a continuation of U.S. application Ser. No. 16/267,817, filed on Feb. 5, 2019, titled “DISPLAYING RELEVANT USER INTERFACE OBJECTS,” which is a continuation of U.S. application Ser. No. 15/033,551, filed on Apr.
29, 2016, titled “DISPLAYING RELEVANT USER INTERFACE OBJECTS,” which is a national stage application under 35 U.S.C. § 371 of International Patent Application No. PCT/US2013/067634, filed on Oct. 30, 2013, titled “DISPLAYING RELEVANT USER INTERFACE OBJECTS,” the contents of each of which are hereby incorporated by reference. FIELD The disclosed embodiments relate generally to user interfaces of electronic devices. BACKGROUND Advanced personal electronic devices can have small form factors. Exemplary personal electronic devices include but are not limited to tablets and smart phones. Uses of such personal electronic devices involve presentation and manipulation of user interface objects on display screens that are designed to be small to complement the personal electronic devices. Exemplary user interface objects include digital images, video, text, icons, control elements such as buttons, and other graphics. As used here, the term icon refers to an image that is used to represent and to launch an application, consistent with its ordinary meaning in the art. In addition, a “widget,” which is used in the art to refer to a simplified view of an application, constitutes an icon, for purposes of this disclosure. Existing user interfaces on reduced-size personal electronic devices can be inefficient, as they may require multiple manipulations by a user before appropriate information is presented. SUMMARY Techniques for presenting user interface objects on a personal electronics device are disclosed. DESCRIPTION OF THE FIGURES FIG. 1 illustrates an exemplary personal electronic device. FIG. 2 illustrates an exemplary user interface. FIG. 3 illustrates an exemplary user interface. FIG. 4 illustrates an exemplary logical structure of a user interface. FIG. 5 illustrates an exemplary user interface. FIG. 6 illustrates an exemplary user interface. FIG. 7 illustrates an exemplary computing system. FIG. 8 illustrates an exemplary user interface. FIG. 
9 illustrates an exemplary user interface. FIG. 10 illustrates an exemplary user interface. FIG. 11 illustrates an exemplary user interface. FIG. 12 illustrates an exemplary user interface. FIG. 13 illustrates an exemplary user interface. FIG. 14 illustrates an exemplary user interface. FIG. 15 illustrates an exemplary user interface. FIG. 16 illustrates an exemplary user interface. FIG. 17 illustrates an exemplary user interface. FIG. 18 illustrates an exemplary process for displaying user interface objects. DETAILED DESCRIPTION In the following description of the disclosure and examples, reference is made to the accompanying drawings in which it is shown by way of illustration specific examples that can be practiced. It is to be understood that other examples can be practiced and structural changes can be made without departing from the scope of the disclosure. FIG. 1 illustrates exemplary personal electronic device 100 (hereafter device 100). In the illustrated example, device 100 includes body 102. Device 100 can have touch-sensitive display screen (hereafter touchscreen) 104. Touchscreen 104 can include a display device, such as a liquid crystal display (LCD), light-emitting diode (LED) display, organic light-emitting diode (OLED) display, or the like, positioned partially or fully behind or in front of a touch sensor panel implemented using any desired touch sensing technology, such as mutual-capacitance touch sensing, self-capacitance touch sensing, resistive touch sensing, projection scan touch sensing, or the like. Touchscreen 104 can allow a user to perform various functions by touching or hovering near the touch sensor panel using one or more fingers or other object. In some embodiments, device 100 can have one or more input mechanisms 106 and 108. Input mechanisms 106 and 108, if included, can be touch-sensitive. Examples of touch-sensitive input mechanisms include touch-sensitive buttons and touch-sensitive surfaces.
Input mechanisms 106 and 108, if included, can be physical. Examples of physical input mechanisms include push buttons and rotatable mechanisms. Body 102, which can include a bezel, can have predetermined regions on the bezel that act as input mechanisms. In some embodiments, device 100 can have an attachment mechanism. Such an attachment mechanism, if included, can permit attachment of device 100 with clothing, jewelry, and other wearable accessories, for example. For example, the attachment mechanism can attach to hats, eyewear, earrings, necklaces, shirts, jackets, bracelets, watch straps, chains, trousers, belts, shoes, purses, backpacks, and so forth. In some embodiments, device 100 can have one or more pressure sensors (not shown) for detecting a force or pressure applied to touchscreen 104. The force or pressure applied to touchscreen 104 can be used as an input to device 100 to perform any desired operation, such as making a selection, entering or exiting a menu, causing the display of additional options/actions, or the like. Different operations can be performed based on the amount of force or pressure being applied to touchscreen 104. The one or more pressure sensors can further be used to determine a position at which the force is being applied to touchscreen 104. 1. Displaying Relevant User Interface Objects FIG. 2 illustrates exemplary device 100 worn by user 201, who is walking towards his vehicle 202. As user 201 moves device 100 into a viewing position, device 100 automatically displays user interface screen 203 on touchscreen 104. In some embodiments, the display elements of touchscreen 104 are inactive until user 201 moves device 100 into viewing position, meaning that the display elements of touchscreen 104 are off or appear to be off.
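The raise-to-view behavior described above can be sketched in code. This is a hypothetical illustration only: the sample layout (vertical-axis accelerometer readings in g), the threshold value, and the function name are assumptions, not the actual detection logic of the device.

```python
# Hypothetical sketch of raise-to-view detection: a wrist raise appears as a
# sustained increase on the accelerometer's vertical axis while the display
# is inactive. Threshold and sample format are illustrative assumptions.

def raised_into_view(z_samples, lift_threshold=0.5):
    """Return True when the vertical-axis reading rises enough over the window."""
    if len(z_samples) < 2:
        return False
    return (z_samples[-1] - z_samples[0]) >= lift_threshold

# A lift from roughly horizontal (near 0 g) toward a viewing angle:
print(raised_into_view([0.0, 0.2, 0.6, 0.9]))  # True
print(raised_into_view([0.9, 0.9, 0.9]))       # False (already steady)
```

A production implementation would also debounce the signal and, as described below, can combine the raise with wrist rotation or hold duration.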
In some embodiments, device 100 can rotate the displayed contents of touchscreen 104 (e.g., between landscape and portrait modes) so that the displayed information is in a proper viewing orientation, regardless of whether device 100 is held upwards, downwards, or sideways by user 201. User interface screen 203 includes user interface objects that device 100 has determined to be the most relevant to the user at this moment. In particular, screen 203 includes an icon 204 for unlocking vehicle 202, which is useful to user 201 as he approaches his vehicle. Screen 203 also includes map icon 205 for accessing traffic information, which can be useful to user 201 as he begins his trip. Screen 203 also includes icon 206 referencing an upcoming calendar event, which can be useful in providing destination information. Sizes of displayed icons can be relative to their relevance. On screen 203, icon 204 is larger than icons 205 and 206 because device 100 has concluded that the unlocking of vehicle 202, provided via icon 204, is the most relevant function. This user interface presentation is notable in that it prioritizes and displays a manageable subset of icons to user 201, even if many more user interface objects are available for display. Also, this user interface is made available to user 201 without any user interface navigation input from the user, other than the raising of his arm (e.g., without requiring user 201 to push a power-on or equivalent button). In this way, device 100 reduces the amount of user input required to invoke an appropriate user interface action. This benefit is non-trivial, particularly because device 100 has a relatively small display screen size, as compared with smart phones and other electronic devices, which can impede a user's navigation of a larger user interface environment.
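The prioritization described above can be sketched as a simple ranking step: score each available icon for relevance, show only the top few, and let the score drive display size. The scores, icon names, and screen capacity of three below are illustrative assumptions, not values from the disclosure.

```python
# A minimal sketch of relevance-based prioritization: rank available icons by
# a relevance score and keep only the initial screen's worth. All names and
# scores are illustrative assumptions.

def top_icons(icons, per_screen=3):
    """Rank icons by descending relevance and keep the first screen's worth."""
    ranked = sorted(icons, key=lambda icon: icon["relevance"], reverse=True)
    return ranked[:per_screen]

icons = [
    {"name": "calendar", "relevance": 0.6},
    {"name": "unlock_vehicle", "relevance": 0.9},
    {"name": "music", "relevance": 0.3},
    {"name": "map", "relevance": 0.7},
]
shown = top_icons(icons)
print([icon["name"] for icon in shown])  # ['unlock_vehicle', 'map', 'calendar']
```

The same ranking could drive icon size, e.g., scaling each icon in proportion to its score, mirroring how icon 204 is drawn larger than icons 205 and 206.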
It is possible for the number of relevant user interface objects in a given situation to exceed the number that can be reasonably displayed together on touchscreen 104, such as three as shown in FIG. 2. When this is the case, device 100 can prioritize the most relevant icons—as determined by a computer-based relevance algorithm on device 100—for initial display. In some embodiments, a user can bring the remaining relevant icons onto the display using input mechanisms 106 or 108. In some embodiments, a user can bring the remaining relevant icons onto the display using touchscreen 104, such as by swiping touchscreen 104 with a touch object. FIG. 3 illustrates the display of relevant icons over multiple user interface screens. In the illustrated example, user interface screen 301 was displayed on device 100 in response to an upward movement of the device. Screen 301 includes icons 302-304 representing relevant applications, which can be icons 204-206 (FIG. 2) in some examples. In response to a rotation of input mechanism 108 in direction 306, user interface screen 311 is displayed on device 100. Screen 311 can show a number of additional relevant icons 312-314 that are less relevant than those shown in screen 301. In response to a further rotation of input mechanism 108 in the same direction 306, device 100 can show user interface screen 321. Screen 321 can include another set of relevant icons 322-324 that are less relevant than those shown in screen 311, which are in turn less relevant than those in screen 301. Input mechanism 108 can be a rotatable crown. In this way, a user can navigate between multiple sets of relevant user interface objects (e.g., icons) on device 100. A user can launch an application that corresponds to a displayed icon by touching (e.g., via a finger tap) the displayed icon. As used here, the launching of an application means that the application runs in the foreground of device 100 and is shown on-screen. FIG.
4 illustrates this aspect. In the illustrated example, user interface screen 401 was displayed on device 100 in response to a movement of the device into viewing position. Screen 401 includes icon 402 representing a messaging application (e.g., supporting Short Message Service (SMS)) having five unread messages, as well as icons 403 and 404 representing other applications. In response to a tap on icon 402 from a touch object (e.g., finger 405), device 100 launches the corresponding messaging application and displays unread messages 412 on user interface screen 411. Under some usage conditions, a user may wish to navigate from the messaging application to another relevant application. For instance, the user may wish to navigate to the music and map applications previously represented by icons 403 and 404 on screen 401. Device 100 can permit navigation between these applications directly, without first returning to screen 401. In particular, a rotation of input mechanism 108 in direction 414 while screen 411 is displayed causes device 100 to display the music player represented by icon 403 on screen 421. Screen 421 can include music playback controls 423. A further rotation of input mechanism 108 in direction 414 while screen 421 is displayed causes device 100 to display the map application represented by icon 404 on screen 431. Screen 431 can include traffic information 432. In some embodiments, screens 411, 421, and 431 include visual aids, such as paging dots 415, 425, and 435, respectively, that identify the relative position of the currently displayed application along the sequence of applications accessible via input mechanism 108. Other visual aids, such as scroll bars and screen-to-screen transitions, can also be used to aid the user's identification of the currently displayed user interface screen in relation to the larger set of available user interface screens. While the exemplary user interface screens depicted in FIGS. 
2-4 are primarily concerned with the efficient display of relevant user interface objects, it should be noted that device 100 can include many more user interface objects that should be accessible to a user, even if their relevance in the moment is not readily discernible. For example, a user may wish to play a game impulsively. Device 100 can permit user navigation beyond relevant user interface objects to other user interface objects. FIG. 5 illustrates this aspect. In FIG. 5, user interface screen 501 is displayed on device 100 in response to movement of the device into viewing position. Screen 501 includes icons 502-504 representing relevant applications, which can be icons 204-206 (FIG. 2) in some examples. In the illustrated example, device 100 has determined that only three user interface objects (i.e., icons 502-504) are relevant at the moment. Thus, in response to a rotation of input mechanism 108 in direction 505, device 100 displays user interface screen 511 having other user interface objects available for user selection on device 100. The icons shown on screen 511 can be a user's favorite icons, meaning that the icons of screen 511 are a predetermined subset of user interface objects available on device 100. In response to a further rotation of input mechanism 108 in direction 505, device 100 displays user interface screen 521, which includes icons that represent all of the available applications on device 100. Because the size of the displayed icons on screen 521 may be too small for user navigation, in response to a further rotation of input mechanism 108 in direction 505, device 100 displays screen 531, which has the effect of zooming into a subset of the icons from screen 521 so that those icons are displayed in larger size for user interaction. The user interface navigation described with reference to FIG. 5 can be logically organized according to logical structure 600 depicted in FIG. 6. In the illustrated example of FIG.
6, x-axis 601 and y-axis 602 form a plane co-planar with the touchscreen surface of device 100 (FIG. 1), and z-axis 603 is perpendicular to the x/y-plane formed by axes 601 and 602. Plane 604, in one example, corresponds to user interface screen 501 (FIG. 5), while plane 605 corresponds to user interface screen 511 (FIG. 5), and plane 607 corresponds to user interface screens 521 and 531 (FIG. 5). More specifically, screen 521 (FIG. 5) can correspond to a viewpoint of the entire content of plane 607, while screen 531 (FIG. 5) can correspond to a zoomed-in viewpoint (i.e., an enlarged subset) of the content of plane 607. In another example, planes 604, 607, 608 can correspond to user interface screens 301, 311, and 321 of FIG. 3, respectively. Movement of an input mechanism can be used to select a particular plane of information (i.e., screen of icons) for display on device 100. For example, rotation of input mechanism 108 can cause different screens of icons to be displayed on device 100 in a fashion similar to that depicted in FIG. 5. 2. Determining Relevant User Interface Objects Consistent with its plain meaning, the phrase “relevant icons” is used here to refer to user interface icons that bear upon or properly apply to the matter that is at hand. In the example of FIG. 2, an icon for unlocking a vehicle application is relevant as a user draws near his car, because the user is likely to want to drive the car. Device 100 can determine relevance using computer instructions (e.g., algorithms) that account for different inputs, including sensor input, application data, and operating system data. FIG. 7 depicts exemplary computing system 700 that, in some embodiments, forms device 100. Computing system 700 includes components for determining and displaying relevant user interface objects.
In the illustrated example, computing system 700 includes an I/O section 704 that can be operatively coupled (connected) with various sensors, such as GPS sensor 720, accelerometer 722, directional sensor 724, gyroscope 726, light sensor 728, and/or a combination thereof. I/O section 704 also can be connected with communication unit 718, for receiving application and operating system data, over Wi-Fi, Bluetooth™, near-field communication (“NFC”), cellular and other wireless communication techniques. In addition, computing system 700 can have bus 702 that connects I/O section 704 together with one or more computer processors 706 and memory section 708. Memory section 708 can contain computer-executable instructions (e.g., representing algorithms) and/or data for determining and displaying relevant user interface objects. One or more of these components can be part of an integrated chip or a so-called system-on-a-chip. In addition, I/O section 704 can be connected to input mechanism 714. I/O section 704 can be connected to one or more input buttons 716. I/O section 704 can be connected to display 710, which can have touch-sensitive component 712 and, optionally, touch-pressure sensitive component 713. The sensors and communication units of computing system 700 can provide information for identifying relevant user interface objects. For example, GPS sensor 720 can determine a user's location and movement while communication unit 718 can receive information about the location and identity of a nearby vehicle (e.g., vehicle 202 in FIG. 2). Accelerometer 722, directional sensor 724, and gyroscope 726 can further detect device movement. Optionally, the outputs of GPS sensor 720, accelerometer 722, directional sensor 724, and/or gyroscope 726 can be interpreted by motion processor 730. Processors 706 and computer-executable instructions in memory section 708 can use some or all of this information to determine that the user is approaching his vehicle. 
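The approach determination described above can be sketched as a distance comparison between successive device fixes and the vehicle's reported location. The haversine helper, the 50 m threshold, and the coordinates below are illustrative assumptions, not the actual relevance algorithm.

```python
# Hypothetical sketch of approach detection: the device's GPS fix is compared
# with the vehicle's location over successive samples, and the user is treated
# as approaching when the distance is both shrinking and within range.
import math

def distance_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two (lat, lon) points (haversine)."""
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def is_approaching(prev_fix, cur_fix, vehicle, near_m=50.0):
    """True when the device is closing on the vehicle and already nearby."""
    d_prev = distance_m(*prev_fix, *vehicle)
    d_cur = distance_m(*cur_fix, *vehicle)
    return d_cur < d_prev and d_cur < near_m

vehicle = (37.3349, -122.0090)  # illustrative parked-vehicle location
print(is_approaching((37.3352, -122.0090), (37.3350, -122.0090), vehicle))  # True
```

Consistent with the text, the decreasing distance could also feed a relevance score that grows as the user nears the vehicle, rather than a simple boolean.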
Processors 706 and instructions in memory 708 can also determine, based on application data and/or operating system data (including meta-data) stored in memory 708, that an application for interacting with the user's vehicle is installed. In this way, the relevance algorithms of device 100 can conclude that the vehicle interaction application is relevant to the user in the moment. In addition, device 100 can also conclude, based on the same data, that a map application would also be relevant to the user. Communication unit 718 can also receive other information that affects the relevance of user interface objects. For example, the communication unit can detect nearby devices that are identical or similar, such as other wearable devices of the same design. The communication unit can also detect non-identical units that are running the same operating system as device 100, such as smart phones and tablets of the same brand. The communication unit can also identify dissimilar devices that support communication over a common protocol. These protocols can include wireless protocols such as Wi-Fi, Bluetooth™, NFC, and the like. These protocols can also be software-based service protocols, such as operating environment service protocols (e.g., Apple™ AirPlay™ and AirDrop™), home automation service protocols (e.g., those offered by Phillips™ Lighting and Nest™), authentication service protocols (e.g., airport clearance and metro fares), and point-of-sale service protocols (e.g., at grocery checkouts), for example. The algorithms used by device 100 to identify relevant user interface objects can account for these inputs provided by communication unit 718. Furthermore, communication unit 718 can receive application and operating system data that inform relevance. For example, a messaging application can receive an incoming message via SMS or Wi-Fi service, and thereby become relevant.
As another example, the relevance algorithms of device 100 can use calendar data and the cellular system time to determine that an event reminder is relevant. Furthermore, the relevance algorithms of device 100 can consider the content of application and operating system data in determining relevance. For example, the algorithms can consider an incoming message that contains a reference to a specific time (e.g., “let's meet at 3:00 p”) to be increasingly relevant as that time (i.e., 3:00 pm) approaches. In some embodiments, user interface objects can be relevant in groups. That is, application data (including meta-data) can specify that whenever user interface object A is relevant, user interface object B is also relevant. For example, a music application can be tied to a vehicle interaction application in this way, because drivers typically enjoy music. A map application can also be tied to a vehicle interaction application in this way, because drivers typically desire traffic and/or routing information. In some embodiments, relevance algorithms used by device 100 can be adaptive, meaning that the outcome of the algorithms can change based on historical user behavior. For example, the algorithms can recognize a user's work commute based on the user's driving pattern during weekday mornings. In this way, device 100 can prioritize specific traffic information for display in the morning. As another example, if a user repeatedly launches one particular radio application over other available radio applications during his commute, device 100 can identify that radio application as being more relevant, and display its icon whenever the user unlocks his car. In some embodiments, computing system 700 can include biometric sensors, such as health-related photoplethysmograph (PPG) sensors, electrocardiography (ECG) sensors, and/or galvanic skin response (GSR) sensors.
Device 100 can receive input from one or more of these sensors to provide health-related information. For example, device 100 can use PPG sensor information to alert a user to abnormal respiratory rate, blood pressure, and/or oxygen saturation. As another example, device 100 can use an ECG sensor to alert a user to irregular heartbeats. As yet another example, device 100 can use a GSR sensor to detect a user's skin moisture indicative of sweating, and prioritize a thermostat application for display on device 100. These sensors can also be used to facilitate biometric identification and authentication of a user. The sensors of computing system 700 can detect when the system (e.g., device 100) is placed into a viewing position. For example, accelerometer 722 and/or motion processor 730 can detect when computing system 700 is raised, lowered, and shaken. These sensors can also detect wrist rotation forward and backward. In some embodiments, the raising of computing system 700 is interpreted as a placement of the device into viewing position. In some embodiments, the raising and rotation of computing system 700 is interpreted as a placement of the device into viewing position. In some embodiments, the time duration between the raising and lowering of computing system 700 is interpreted as a placement of the device into viewing position. Algorithms used by device 100 to identify relevant user interface objects for display can use one or more of the above-described aspects of the device (e.g., computing system 700). That is, the algorithms can consider a combination of inputs in determining relevance, including location, movement (including orientation, direction, tilt, acceleration, and velocity), ambient conditions (including light, time, temperature, and the user's health status), and application data (including incoming calls, incoming messages, and upcoming calendar events).
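One way to picture combining such inputs is a weighted score per icon, clamped to a fixed range. The particular features, weights, and linear form below are assumptions for illustration; the disclosure does not specify the actual scoring function.

```python
# A toy sketch of combining multiple inputs into a single relevance score.
# Feature names, weights, and the linear form are illustrative assumptions.

def relevance_score(signals, weights):
    """Weighted sum of normalized signals, clamped to [0, 1]."""
    score = sum(weights.get(name, 0.0) * value for name, value in signals.items())
    return max(0.0, min(1.0, score))

weights = {"proximity_to_vehicle": 0.5, "upcoming_event": 0.3, "unread_message": 0.2}

# Walking up to the car with an event fairly soon and no unread messages:
signals = {"proximity_to_vehicle": 1.0, "upcoming_event": 0.5, "unread_message": 0.0}
print(relevance_score(signals, weights))  # 0.65
```

An adaptive variant, as the text describes, could raise a weight each time the user launches the corresponding application in the same context (e.g., a favorite radio app during the commute).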
For example, device 100 can determine that when it is moving at a velocity that exceeds a threshold (e.g., 10 mph, 20 mph, 25 mph, 30 mph, 40 mph, 50 mph, 55 mph, 60 mph, 65 mph, and so forth), the user of the device is commuting, and that icons corresponding to navigational applications have higher relevance. In this situation, device 100 can also determine that icons representing in-vehicle entertainment applications are relevant, if an available in-vehicle device is in communication with the communication unit of device 100. As another example, device 100 can determine that when its biometric sensors and motion sensors detect movement indicative of exercising, icons representing health-related applications have higher relevance. As another example, device 100 can determine that a calendar event that is coming up in a particular amount of time (e.g., 15 minutes, 30 minutes, 1 hour, 1 day, 1 week, and so forth) is of higher relevance. Optionally, device 100 can factor in other variables, such as the distance between the device's current location and the event's location, as well as the current weather, in determining the relevance of an event. That is, device 100 may determine that a nearby event that is upcoming in 15 minutes has less relevance than an event that is upcoming in an hour but is 30 miles away, for example. 3. Exemplary User Interactions A user can interact with the user interface of device 100. These interactions can include shortcuts for invoking application features. This aspect is discussed with reference to FIGS. 8-9. In the example of FIG. 8, device 100 had just received an incoming SMS message, and had provided haptic feedback to the user. In response to the haptic feedback, the user raises device 100 into viewing position, thereby causing device 100 to display user interface screen 801. Screen 801 includes icons 802-804 representing applications that device 100 has determined to be relevant to the user at the moment.
Icon 802 represents the unread SMS message. Icon 803 represents an upcoming calendar event. Icon 804 represents available traffic information. Icon 802 is displayed in large format because the SMS message, which was recently received, ranks highest in relevance. Because messaging icon 802 has the highest relevance, when the user rotates input mechanism 108 in direction 805, device 100 launches the corresponding messaging application and displays unread SMS message 812 on user interface screen 811. In response to a further rotation of input mechanism 108 in direction 805, device 100 displays calendar event 822 in the calendar application represented by icon 803 on user interface screen 821. In response to a further rotation of input mechanism 108 in direction 805, device 100 displays traffic information provided by the map application (corresponding to icon 804) on user interface screen 831. From screen 811, a user may tap on SMS message 812 to invoke user interface screen 901, shown in FIG. 9. Turning to FIG. 9, screen 901 includes icon 902 for responding to SMS message 812. Screen 901 also includes icon 903 for creating an alarm at 3 o'clock in the afternoon as suggested by SMS message 812. Similarly, when screen 821 (FIG. 8) is displayed, the user may tap on calendar event 822 to invoke user interface screen 911, shown in FIG. 9. Screen 911 includes icon 912 for messaging an event attendee (e.g., Larry). Screen 911 also includes icon 913 for obtaining navigation to the event location. Finally, when screen 831 (FIG. 8) is displayed, a user may tap on map 832 to invoke user interface screen 921, shown in FIG. 9. Screen 921 includes icon 922 for setting a navigation waypoint and icon 923 for obtaining turn-by-turn navigation instructions. In some embodiments, device 100 can distinguish between short taps and long taps on touch-screen 104 (FIG. 1), and invoke screen 901 only after a long-tap on screen 811 (FIG. 8), for example. 
For purposes of this disclosure, a short tap refers to a brief touch on touchscreen 104 (FIG. 1) followed by a release of the touch. A long tap refers to a longer touch on touchscreen 104 (FIG. 1) before touch release. Device 100 can consider touches exceeding a predetermined duration to be long taps (and touches of shorter duration to be short taps). In some embodiments, device 100 can distinguish between levels of pressure on touchscreen 104. That is, device 100 can detect the intensity of a touch object (e.g., a user's finger) on touchscreen 104. Thus, device 100 can invoke screen 901 only after a user taps on screen 811 (FIG. 8) with sufficient pressure. In some embodiments, device 100 can distinguish between brief glances and longer stares at touchscreen 104 (FIG. 1). A brief glance can be characterized by a short duration between the raising of the device into viewing position and the subsequent lowering of the device. A longer stare can be characterized by a period of relative steadiness of the device in the viewing position. Device 100 can respond to brief glances and longer stares differently. This aspect is illustrated by FIG. 10. In the example of FIG. 10, user interface screen 1001 was displayed in response to a user's movement of device 100 into viewing position. However, instead of displaying multiple relevant user interface objects, user interface screen 1001 emphasizes the display of an unread SMS message 1002 from a contact, because message 1002 had arrived immediately before device 100 was raised into viewing position. If the user maintains device 100 in the viewing position for longer than a predetermined duration, device 100 replaces screen 1001 with user interface screen 1011, which shows multiple icons representing relevant user interface objects available on device 100. From screen 1011, the user can tap on icon 1012 using finger 1013 to return to SMS message 1002. 
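The duration-based distinctions above (short tap vs. long tap, brief glance vs. longer stare) reduce to simple threshold comparisons. A minimal sketch, with threshold values assumed since the patent leaves the "predetermined duration" unspecified:

```python
# Assumed thresholds; the patent only says "a predetermined duration".
LONG_TAP_SECONDS = 0.5
STARE_SECONDS = 2.0

def classify_tap(touch_duration_s: float) -> str:
    """Touches exceeding a predetermined duration are long taps;
    shorter touches are short taps."""
    return "long_tap" if touch_duration_s >= LONG_TAP_SECONDS else "short_tap"

def classify_raise(hold_duration_s: float) -> str:
    """A brief glance: the device is raised and soon lowered again.
    A longer stare: the device is held steady in the viewing position."""
    return "stare" if hold_duration_s >= STARE_SECONDS else "glance"
```

A glance would then leave the emphasized message on screen 1001, while a stare would trigger the replacement with screen 1011.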
In this way, device 100 permits a user to briefly glance at an incoming message. 4. Exemplary User Interfaces FIGS. 11-16 illustrate exemplary user interfaces that device 100 can display, based on relevance, over the course of a day. In FIG. 11, device 100 determines that the user has recently awakened, and displays an appropriate greeting 1102 stating “good morning”. Device 100 can make this determination based on the time of day, the user's interaction with an alarm clock application (e.g., the user may have just turned off an alarm), and/or movement of the device that indicates the user is walking after a sedentary period, for example. Device 100 can rank greeting 1102 as the most relevant icon to be displayed to a user as he wakes up. Because of its high relevance, greeting 1102 is emphasized on user interface screen 1101, meaning that greeting 1102 can be the largest icon displayed, or the only icon displayed. Note, however, that when greeting 1102 is the only icon displayed, other non-icon user interface elements (such as the current time) can still be displayed on-screen. User interface screen 1111 depicts another exemplary user interface that device 100 can display as its user wakes up. Screen 1111 includes icon 1112 indicating the current time. Icon 1112 can have circumferential outline 1113 indicating the time remaining in snooze. Optionally, icon 1112 can have a background that indicates the current weather, for example, with blue representing temperate weather and gray representing inclement weather. Screen 1111 can also include icon 1115 indicating unread messages that the user should attend to. FIG. 12 illustrates user interface screen 1201, which can show additional relevant user interface objects after a user wakes up. Screen 1201 includes relevant icons 1202-1204. Icon 1202 can correspond to a health application and indicate sleep information, such as the duration of sleep by the user. 
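The emphasis rule just described (the most relevant icon is shown largest, or shown alone) can be sketched as a small layout helper. The dominance factor deciding when the top icon appears alone is an invented assumption:

```python
def layout_icons(ranked):
    """ranked: list of (icon_name, relevance) pairs, highest relevance first.
    Returns (icon_name, size) pairs: the top icon is emphasized, shown
    alone when it clearly dominates, otherwise shown large with the
    remaining icons small."""
    if not ranked:
        return []
    top_name, top_rel = ranked[0]
    # Assumed rule: show only the top icon when it is at least twice
    # as relevant as the runner-up.
    if len(ranked) == 1 or top_rel >= 2 * ranked[1][1]:
        return [(top_name, "only")]
    return [(top_name, "large")] + [(name, "small") for name, _ in ranked[1:]]
```

With this rule, a dominant morning greeting would be displayed alone (as on screen 1101), while the SMS/calendar/traffic trio of FIG. 8 would be displayed as one large icon and two small ones.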
Icon 1203 can correspond to calendar information, such as the remaining time before a next calendar event. Icon 1204 can correspond to additional calendar information, such as all-day events. User interface screen 1211 depicts additional relevant user interface objects that device 100 can display after a user wakes up. Screen 1211 includes relevant icons 1212 and 1213. Icon 1212 can correspond to a weather application indicating the weather at the device's present location. Optionally, icon 1212 can indicate the weather at a location that the user historically travels to in the morning, such as the weather at the user's work location. In addition, icon 1213 can indicate that the user should begin his morning commute to work in 45 minutes. Device 100 can make this determination based on the first event in today's calendar, the user's usual travel destination on weekday mornings, and the estimated time of travel to that destination based on distance and traffic information, for example. User interface screen 1221 depicts additional relevant user interface objects that device 100 can display later in the morning. Exemplary user interface screen 1221 includes relevant icons 1222-1224. Icon 1222, which indicates the weather condition, can display the same information that was displayed earlier by icon 1212. However, while icon 1212 was the most relevant icon on screen 1211, its relevance in screen 1221 is superseded by traffic icon 1223. Traffic icon 1223 indicates a traffic alert and is displayed as the largest icon because device 100 has determined that information about an accident along the user's typical morning commute is highly relevant at the moment. Screen 1221 also includes icon 1224 indicating that the user should begin his commute to work in 10 minutes, rather than the 45-minute indication given earlier by icon 1213, in view of traffic information (caused by the accident) received by device 100. Turning to FIG. 
13, screen 1301 depicts icon 1302 for unlocking the user's vehicle as he approaches his vehicle. Device 100 can display icon 1302 based on the decreasing distance between device 100 and his nearby vehicle. Optionally, screen 1301 can include additional relevant icons, such as those discussed with respect to FIG. 1. While the user is in his car, device 100 can display user interface screen 1311 if it is raised into viewing position. Screen 1311 includes information about the estimated time of arrival (“ETA”) at work (i.e., icon 1312), the time to his next calendared meeting (i.e., icon 1313), and the music player (i.e., as represented by icon 1314), which are relevant to the user as he is en route to work. Device 100 can determine that the user is driving based on GPS movement and/or by communication with an in-car telematics system (e.g., through Bluetooth™ or a cable connection). Device 100 can determine that the user is driving to his work based on historical information about the user's commute pattern. As the user nears his workplace, the estimated time of arrival may become less relevant, causing the information to be displayed with less emphasis. For example, in user interface screen 1321, music icon 1322 is displayed in larger format than ETA icon 1324. Icon 1323 can continue to display the time to the next calendared meeting as the information continues to be highly relevant. Device 100 can mark the reminder as highly relevant if the meeting is off-site (i.e., physically far from the user's work location), based on GPS sensor and calendar information. Turning to FIG. 14, later in the day, the user of device 100 can visit a store such as a coffee shop. On screen 1401, device 100 can display an electronic-payment icon 1402 that permits the user to authorize a purchase at the coffee shop. 
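The leave-by suggestions described with respect to FIG. 12 (45 minutes under normal conditions, dropping to 10 minutes after a traffic accident) amount to subtracting the current travel estimate from the first event's start time. A sketch, with the buffer and the concrete times as illustrative assumptions:

```python
def minutes_until_departure(event_start_min: float, now_min: float,
                            travel_min: float, buffer_min: float = 5.0) -> float:
    """Minutes remaining before the user should leave to reach the first
    calendar event on time, given the current travel-time estimate.
    buffer_min is an assumed safety margin."""
    leave_by = event_start_min - travel_min - buffer_min
    return max(leave_by - now_min, 0.0)

# A 9:00 meeting (minute 540 of the day), checked at 7:45 (minute 465),
# with a 25-minute drive: leave in 45 minutes.
normal = minutes_until_departure(event_start_min=540, now_min=465, travel_min=25)
# The same meeting after an accident pushes the drive to 60 minutes:
# leave in only 10 minutes.
with_traffic = minutes_until_departure(event_start_min=540, now_min=465, travel_min=60)
```

Re-running the computation as traffic data arrives is what lets icon 1224 revise the earlier 45-minute suggestion of icon 1213.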
Device 100 can determine its proximity to the coffee shop based on GPS information and application data provided by a map application or a third-party application, such as a Starbucks™ application. Device 100 can also determine its proximity to the coffee shop based on wireless communication with the store's point-of-sale system, such as through near-field communication with a payment reader. In addition, on screen 1401, device 100 can display icon 1403 indicating the proximity of a contact (e.g., a friend) at the coffee shop. On screen 1411, device 100 can display icon 1412 indicating a new incoming message, icon 1413 counting down to an upcoming meeting, and icon 1414 suggesting that the user should take the stairs to the meeting for additional exercise. Device 100 can remind a user if he is late to a meeting. For example, on screen 1421, device 100 can display icon 1422 alerting the user that he is eight minutes late to a calendared meeting, and icons 1423 and 1424 alerting the user to new incoming messages, some of which may have been triggered by his absence from the meeting. Turning to FIG. 15, device 100 can display relevant information as the workday draws to a close. On user interface screen 1501, device 100 can display the user's ETA to home (i.e., icon 1502) and his spouse's ETA to home (i.e., icon 1503). On user interface screen 1511, device 100 can continue to display the user's ETA to home (i.e., icon 1512), a music application for changing the music in his vehicle (i.e., icon 1513), and a stress level indicator (i.e., icon 1514). Device 100 can calculate the user's stress level based on sensor input including, for example, PPG, ECG, and GSR sensor readings. As the user arrives home and looks at device 100, device 100 can display icon 1522 for unlocking a wireless-enabled front-door lock. 
Device 100 can also display icon 1524 for controlling in-home electronics, such as lighting and furnace settings, through Wi-Fi enabled lighting and HVAC controllers. Device 100 can also display icon 1523 indicating a dinner event. Turning to FIG. 16, device 100 can display relevant information as the day ends. On user interface screen 1601, device 100 can display icon 1602 suggesting that the user should sleep soon, based on the user's usual sleep time and the next morning's calendared activities, for example. Device 100 can also display icon 1604 for controlling televisions, based on the user's habit of watching television at night. Device 100 can also display icon 1603 for lighting control, also based on the user's usual end-of-day routine. As the user's usual bedtime draws near, device 100 can display a summary of the user's physical activities for the day (i.e., icon 1612 indicating the user met 75% of their daily goal), and an alarm clock icon 1613 for setting an alarm for the next morning. Device 100 can also reduce the number of user interface objects displayed at the end of the day. For example, as shown on screen 1621, device 100 can display a single icon 1622 suggesting sleep. In addition, icon 1622 can be displayed using light wavelengths that are less likely to interfere with a user's sleep pattern. In this way, device 100 can avoid keeping its user awake and/or waking its sleeping user. Optionally, device 100 can be configured to display a clock face persistently. This aspect is described with respect to FIG. 17. In the illustrated example, device 100 displays user interface screen 1701 in response to the raising of the device into viewing position. On screen 1701, clock 1702 is displayed together with relevant icon 1703. As device 100 identifies additional relevant user interface objects, they can be displayed in the foreground of touchscreen 104 (FIG. 
1) about the circumference of clock 1702, as demonstrated by additional relevant icons 1712 and 1713 on screens 1711 and 1721. In this way, a user can configure device 100 so as to emphasize its time-keeping function. FIG. 18 depicts exemplary process 1800 that can be performed by device 100 to display relevant user interface objects. At block 1810, device 100 obtains input from a movement sensor indicating movement of the device into a viewing position. In some embodiments, the movement can be an upward movement. At block 1820, device 100 obtains additional sensor data. Such sensor data can include GPS location information, lighting information, movement information, and/or accelerometer information. At block 1830, device 100 obtains application or operating system data. Such data can be obtained through a communication channel such as Wi-Fi, Bluetooth™, or NFC. At block 1840, device 100 identifies, based on the sensor data and application/OS data, user interface objects that are relevant for display to the user. Device 100 can also rank the relevant user interface objects. At block 1850, the most relevant user interface objects are displayed to the user. At block 1860, device 100 receives an input representing movement of an input mechanism. In response, at block 1870, device 100 displays icons representing the user's favorite applications available on the device. At block 1880, device 100 receives an additional input representing movement of an input mechanism. In response, at block 1890, device 100 displays icons representing all of the available applications on the device. Turning back to FIG. 7, memory section 708 of computing system 700 can be a non-transitory computer readable storage medium storing computer-executable instructions which, when executed by one or more computer processors 706, for example, can cause the computer processors to perform the user interface techniques described above, including process 1800 (FIG. 18). 
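The three display stages of process 1800 (ranked relevant objects on raise, favorites after one input-mechanism movement, all applications after another) form a small state machine. A minimal sketch, with class and field names invented for illustration:

```python
# Hypothetical sketch of process 1800 (FIG. 18): relevant objects on
# raise (blocks 1810-1850), favorites after one input-mechanism
# movement (1860-1870), all apps after another (1880-1890).
class Device:
    def __init__(self, relevant, favorites, all_apps):
        self.relevant = relevant      # list of {"name": str, "relevance": float}
        self.favorites = favorites    # list of app names
        self.all_apps = all_apps      # list of app names
        self.screen = None

    def on_raise(self):
        # Blocks 1810-1850: rank relevant objects, display the ranking.
        ranked = sorted(self.relevant, key=lambda o: o["relevance"], reverse=True)
        self.screen = [o["name"] for o in ranked]
        return self.screen

    def on_input_movement(self):
        # Block 1860/1880: first movement shows favorites, the next all apps.
        if self.screen != self.favorites and self.screen != self.all_apps:
            self.screen = self.favorites
        else:
            self.screen = self.all_apps
        return self.screen

device = Device(
    relevant=[{"name": "sms", "relevance": 0.9}, {"name": "calendar", "relevance": 0.5}],
    favorites=["mail", "music"],
    all_apps=["mail", "music", "maps", "photos"],
)
```

Each successive input thus widens the view: from relevance-filtered icons, to the user's chosen favorites, to the full application set.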
The computer-executable instructions can also be stored and/or transported within any non-transitory computer readable storage medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions. For purposes of this document, a “non-transitory computer readable storage medium” can be any medium that can contain or store computer-executable instructions for use by or in connection with the instruction execution system, apparatus, or device. The non-transitory computer readable storage medium can include, but is not limited to, magnetic, optical, and/or semiconductor storages. Examples of such storage include magnetic disks, optical discs based on CD, DVD, or Blu-ray technologies, as well as RAM, ROM, EPROM, flash memory, and solid-state memory. Computing system 700 is not limited to the components and configuration of FIG. 7, but can include other or additional components in multiple configurations. Although the disclosure and examples have been fully described with reference to the accompanying figures, it is to be noted that various changes and modifications will become apparent to those skilled in the art. Such changes and modifications are to be understood as being included within the scope of the disclosure and examples as defined by the appended claims. 17190869 apple inc. USA B2 Utility Patent Grant (with pre-grant publication) issued on or after January 2, 2001. Open Apr 27th, 2022 08:45AM Apr 27th, 2022 08:45AM Technology Technology Hardware & Equipment Information Technology
nasdaq:aapl Apple Apr 26th, 2022 12:00AM Jun 26th, 2020 12:00AM https://www.uspto.gov?id=USD0949553-20220426 Accessory for an electronic device D949553 The ornamental design for an accessory for an electronic device, as shown and described. 1 FIG. 1 is a bottom front perspective view of an accessory for an electronic device showing the claimed design; FIG. 2 is a bottom rear perspective view thereof; FIG. 3 is a front view thereof; FIG. 4 is a rear view thereof; FIG. 5 is a right side view thereof; FIG. 6 is a left side view thereof; FIG. 7 is a top view thereof; FIG. 8 is a bottom view thereof; and, FIG. 9 is a perspective view thereof showing the accessory for an electronic device in an environment in which it may be used. The broken lines in FIG. 9 show environment that forms no part of the claimed design. 29739579 apple inc. USA S1 Design Patent Open D3/215 15 Apr 27th, 2022 08:45AM Apr 27th, 2022 08:45AM Technology Technology Hardware & Equipment Information Technology
nasdaq:aapl Apple Apr 26th, 2022 12:00AM Oct 20th, 2017 12:00AM https://www.uspto.gov?id=USRE049051-20220426 System and method for scaling up an image of an article displayed on a sales promotion web page Method and arrangement for scale-up of an image of an article displayed on a sales promotion web page is disclosed. The method includes displaying an image of an article on a sales promotion web page in a first scale. The method then includes, based on cursor-selection of the displayed image, enlarging the scale of the image to a second larger scale on the same web page. RE49051 1. A method comprising: displaying a first image in a main display area and a second image in a thumbnail display area on a web page hosted on a web server, wherein the first image is displayed in a first scale and the second image is displayed in a second scale that is smaller than the first scale; receiving a selection of the second image; based on the selection, moving the second image to a first location associated with the first image and moving the first image to a second location corresponding to a previous location of the second image on the web page displaying the second image in the main display area by replacing the first image, wherein the second image is enlarged to the first scale as the second image is moved to the first location, the enlarging being performed in a continuous and animated manner, without refreshing the web page presented in the first scale, and the first image is presented in the second scale; while the second image is located at the first location displayed in the main display area, detecting that an indicator is positioned at a first point over the second image, wherein movement of the indicator is controlled by a user; in response to detecting that the indicator is positioned at the first point, enlarging a portion of the second image corresponding to the first point, wherein the portion of the second image that is enlarged is presented in a third scale 
that is larger than the first scale; and after enlarging the portion of the second image, relocating web page elements to accommodate the enlarged portion of the second image, wherein relocating the web page elements comprises animating the web page elements to smoothly move from an original position and size to a new position and new size; receiving a further selection of an image presented in the second scale from the thumbnail display area; and presenting the selection of the image presented in the second scale in the same scale as the scale of a most recently enlarged image. 2. The method of claim 1, wherein enlarging the portion of the second image is performed by smoothly expanding the portion of the second image to the third scale. 3. The method of claim 2, wherein the portion of the second image that is enlarged to be presented in a third scale is presented at a first resolution level and a second portion of the second image that is not enlarged is presented at a second resolution level, the first resolution level being higher than the second resolution level that is higher than the resolution level of the second image that is presented at the first scale. 4. The method of claim 1, further comprising: displaying alternative thumbnail images on the web page in the thumbnail display area, wherein the second image comprises one of the alternative thumbnail images; receiving a designation of a selected thumbnail image, the selected thumbnail image being the second image; exchanging the selected thumbnail image with the first image by smoothly enlarging the selected thumbnail image, smoothly reducing the first image in size, and smoothly transitioning the first image to the thumbnail display area and the selected thumbnail image to each other's respective location to the main display area; and presenting an animated movement of information associated with at least one of the selected thumbnail image and the first image. 5. 
The method of claim 1, further comprising removing the web page elements to accommodate the second image. 6. The method of claim 1, further comprising converting the web page elements to semi-transparent form and at least partially overlaying the web page elements in semi-transparent form on the first second image. 7. A system comprising: a computer processor; and a memory containing instructions that, when executed, cause the computer processor to: display a first image in a main display area and a second image in a thumbnail display area on a web page hosted on a web server, wherein the first image is displayed in a first scale and the second image is displayed in a second scale that is smaller than the first scale; receive a selection of the second image; based on the selection, move the second image to a first location associated with the first image and moving the first image to a second location corresponding to a previous location of the second image on the web page display the second image in the main display area by replacing the first image, wherein the second image is enlarged to the first scale as the second image is moved to the first location, the enlarging being performed in a continuous and animated manner presented in the first scale, and the first image is presented in the second scale, without refreshing the web page; while the second image is located at the first location displayed in the main display area, detect that an indicator is positioned at a first point over the second image, wherein movement of the indicator is controlled by a user; in response to detecting that the indicator is positioned at the first point, enlarge a portion of the second image corresponding to the first point, wherein the portion of the second image that is enlarged is presented in a third scale that is larger than the first scale, and after enlarging the portion of the second image, relocate web page elements to accommodate the enlarged portion of the second image, 
wherein relocating the web page elements comprises animating the web page elements to smoothly move from an original position and size to a new position and new size; after enlarging the portion of the second image, receiving a selection of the first image or a third image; and in response to the selection of the first image or the third image, displaying the selected first image or third image in the third scale. 8. The system of claim 7, wherein enlarging the portion of the second image is performed by smoothly expanding the portion of the second image to the third scale. 9. The system of claim 8, wherein the portion of the second image that is enlarged to be presented in a third scale is presented at a first resolution level and a second portion of the second image that is not enlarged is presented at a second resolution level, the first resolution level being higher than the second resolution level that is higher than the resolution level of the second image that is presented at the first scale. 10. The system of claim 7, wherein the instructions further cause the computer processor to: display alternative thumbnail images on the web page in the thumbnail display area, wherein the second image comprises one of the alternative thumbnail images; receiving a designation of a selected thumbnail image, the selected thumbnail image being the second image; exchange the selected thumbnail image with the first image by smoothly enlarging the selected thumbnail image, smoothly reducing the first image in size, and smoothly transitioning the first image to the thumbnail display area and the selected thumbnail image to each other's respective location to the main display area; and present an animated movement of information associated with at least one of the selected thumbnail image and the first image. 11. The system of claim 7, wherein the instructions further cause the computer processor to: remove the web page elements to accommodate the second image. 12. 
The system of claim 7, wherein the instructions further cause the computer processor to: convert the web page elements to semi-transparent form and at least partially overlaying the web page elements in semi-transparent form on the first second image. 13. A non-transitory computer-readable medium storing instructions that, when executed by a computer processor, cause the computer processor to: display a first image in a main display area and a second image in a thumbnail display area on a web page hosted on a web server, wherein the first image is displayed in a first scale and the second image is displayed in a second scale that is smaller than the first scale; receive a selection of the second image; based on the selection, move the second image to a first location associated with the first image and moving the first image to a second location corresponding to a previous location of the second image on the web page display the second image in the main display area by replacing the first image, wherein the second image is enlarged to the first scale as the second image is moved to the first location, the enlarging being performed in a continuous and animated manner, without refreshing the web page presented in the first scale, and the first image is presented in the second scale; while the second image is located at the first location displayed in the main display area, detect that an indicator is positioned at a first point over the second image, wherein movement of the indicator is controlled by a user; in response to detecting that the indicator is positioned at the first point, enlarge at least a portion of the second image corresponding to the first point, wherein the at least the portion of the second image that is enlarged is presented in a third scale that is larger than the first scale, and after enlarging the at least the portion of the second image, relocate web page elements to accommodate the at least the portion of the second image, wherein 
relocating the web page elements comprises animating the web page elements to smoothly move from an original position and size to a new position and new size; receiving a further selection of an image presented in the second scale from the thumbnail display area; and presenting the selection of the image presented in the second scale in the same scale as the scale of a most recently enlarged image. 14. The non-transitory computer-readable medium of claim 13, wherein the portion of the second image that is enlarged is presented at a higher resolution level than the portion of the second image prior to being enlarged. 15. The non-transitory computer-readable medium of claim 13, wherein the instructions further cause the computer processor to: display alternative thumbnail images on the web page in the thumbnail display area, wherein the second image comprises one of the alternative thumbnail images; receiving a designation of a selected thumbnail image, the selected thumbnail image being the second image; exchange the selected thumbnail image with the first image by smoothly enlarging the selected thumbnail image, smoothly reducing the first image in size, and smoothly transitioning the first image to the thumbnail display area and the selected thumbnail to the main display area; and present an animated movement of information associated with at least one of the selected thumbnail image and the first image. 16. The non-transitory computer-readable medium of claim 13, wherein the instructions further cause the computer processor to: remove the web page elements to accommodate the second image. 17. The non-transitory computer-readable medium of claim 13, wherein the instructions further cause the computer processor to: convert the web page elements to semi-transparent form and at least partially overlaying the web page elements in semi-transparent form on the second image. 18. 
The method of claim 1, comprising displaying the enlarged portion in an overlay at least partially overlaying the web page elements. 19. The system of claim 7, wherein the instructions further cause the computer processor to: display the enlarged portion in an overlay at least partially overlaying the web page elements. 20. The non-transitory computer-readable medium of claim 13, wherein the instructions further cause the computer processor to: display the enlarged portion in an overlay at least partially overlaying the web page elements. 21. The method of claim 1, comprising: displaying alternative thumbnail images for the same item on the web page in the thumbnail display area, wherein the second image comprises one of the alternative thumbnail images. 22. The method of claim 1, wherein the relocating the web page elements to accommodate the enlarged portion of the second image, further includes relocating the web page elements to accommodate the thumbnail display area. 22 BACKGROUND 1. Field The present disclosure relates to displaying images of articles for sale on a web page, and more particularly to scaling up an image of an article displayed on a sales promotion web page. 2. Introduction With the development of the Internet, on-line stores have become very popular. These stores allow consumers with Internet access to browse and purchase articles for sale. Consumers typically access these stores through the World Wide Web via web pages viewed through a web browser. Examples of articles that can be purchased on-line include clothes, books, electronic devices, toys, games, downloadable media, travel reservations, and furniture. This is not an exhaustive list. Virtually any article that can be purchased in a traditional store can be bought from an on-line store. After the transaction is complete, the online store ships the purchased item if it is a tangible item. 
Alternatively, if the purchased article is digital media, the store can transfer the purchased digital content to the customer upon payment via download. Purchased digital content can be downloaded and played on personal computers, portable media players, smart phones, cell phones, televisions, television media players, video game devices and other electronic devices. On-line stores are popular with consumers because they can conveniently shop for a very large variety of articles whenever they are connected to the Internet. Generally, consumers desire an on-line store with a large selection of articles for sale at competitive prices. On-line store designers may benefit from showing a potential customer a large variety of products with smaller images, and then scaling up images of articles in which the customer indicates a particular interest. An online store can alter the format of the web page to further tailor an article's presentation once a potential customer has indicated an interest in the item. Traditionally, online stores have provided an enlarged or more detailed view of an item by loading a different web page containing the more detailed image. Unfortunately, downloading a different web page often involves complexities such as the need for the user to manage multiple web pages (i.e., the original web page with the smaller image and the different web page with the enlarged image), a browsing history, pop-ups, or other mechanisms. Moreover, downloading a new web page ordinarily involves a momentary delay that is noticeable to the user, during which the new page flashes into view. In some parts of the world where access to the Internet is metered and not unlimited, these extra page refreshes can add up to a significant expense. Therefore, what is needed is a system and method for scale-up of an image of an article displayed on a sales promotion web page that enlarges the image within the same web page and provides an enhanced user experience. 
SUMMARY Additional features and advantages of the disclosure will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by practice of the teachings contained herein. The features and advantages of the disclosure may be realized and obtained by means of the instruments and combinations particularly pointed out in the appended claims. These and other features of the disclosure will become more fully apparent from the following description and appended claims, or may be learned by the practice of the teachings set forth herein. This disclosure describes a system and method for scaling up an image of an article displayed on a sales promotion web page. Disclosed are systems, methods and computer readable media for scaling up an image of an article displayed on a sales promotion web page. Sales promotion web pages include any commerce-related website. The method embodiment is illustrative of the principles of the present disclosure, and those same principles are also applicable to the system and computer readable medium embodiments. A method of scaling up an image of an article displayed on a sales promotion web page is disclosed. The method includes displaying an image of an article on a sales promotion web page in a first scale. Then, based on cursor-selection of the displayed image, the method includes enlarging the scale of the image to a second larger scale on the same web page. The present disclosure may apply to any image on a sales promotion web page. These images can be, for example, in JPG, BMP, TIFF, GIF, or PNG format, among others. BRIEF DESCRIPTION OF THE DRAWINGS In order to better describe the manner in which the advantages and features of the disclosure can be obtained, a more detailed description follows with reference to specific embodiments that are illustrated in the accompanying drawings in which: FIG. 1 illustrates an example system embodiment; FIG. 
2A illustrates an article for sale on a web page; FIG. 2B illustrates an example system embodiment of a scaled-up image of an article displayed on a sales promotion web page; FIG. 3A illustrates an example of how to enlarge the scale of a portion of the second larger scale image to a third even larger scale image based on cursor designation of the portion by continuously expanding the scale of the portion of the image from the second larger scale to the third even larger scale; FIG. 3B illustrates an example of how to replace the third even larger scale image portion with a higher resolution image of the same image portion; and FIG. 4 illustrates an example method embodiment for scale-up of an image of an article displayed on a sales promotion web page. DETAILED DESCRIPTION Various embodiments configured according to the present disclosure are discussed in detail below. While specific implementations are discussed, it should be understood that this is done for illustration purposes only. A person skilled in the relevant art will recognize that other components and configurations may be used without departing from the spirit and scope of the description. With reference to FIG. 1, an exemplary system comprises (includes) a general-purpose computing device 100, including a processing unit (CPU) 120 and a system bus 110 that couples various system components including the system memory such as read only memory (ROM) 140 and random access memory (RAM) 150 to the processing unit 120. Other system memory 130 may be available for use as well. It can be appreciated that certain presently described embodiments can operate on a computing device with more than one CPU 120 or on a group or cluster of computing devices networked together to provide greater processing capability. A processing unit 120 can include a general purpose CPU controlled by software as well as a special-purpose processor. 
An Intel Xeon LV L7345 processor is an example of a general purpose CPU which is controlled by software. Particular functionality may also be built into the design of a separate computer chip. An STMicroelectronics STA013 processor is an example of a special-purpose processor which decodes MP3 audio files. Of course, a processing unit includes any general purpose CPU and a module configured to control the CPU as well as a special-purpose processor where software is effectively incorporated into the actual processor design. A processing unit may essentially be a completely self-contained computing system, containing multiple cores or CPUs, a bus, memory controller, cache, etcetera. A multi-core processing unit may be symmetric or asymmetric. The system bus 110 may be any of several types of bus structures including a memory bus or memory controller, a peripheral bus, and a local bus using any of a variety of bus architectures. A basic input/output system (BIOS) stored in ROM 140 or the like, may provide the basic routine that helps to transfer information between elements within the computing device 100, such as during start-up. The computing device 100 further includes storage devices such as a hard disk drive 160, a magnetic disk drive, an optical disk drive, tape drive or the like. The storage device 160 is connected to the system bus 110 by a drive interface. The drives and the associated computer readable media provide nonvolatile storage of computer readable instructions, data structures, program modules and other data for the computing device 100. In one aspect, a hardware module that performs a particular function includes the software component stored in a tangible computer-readable medium in connection with the necessary hardware components, such as the CPU, bus, display, and so forth, to carry out the function. 
The basic components are known to those of skill in the art and appropriate variations are contemplated depending on the type of device, such as whether the device is a small, handheld computing device, a desktop computer, or a computer server. Although the exemplary environment described herein employs the hard disk, it should be appreciated by those skilled in the art that other types of computer readable media which can store data that are accessible by a computer, such as magnetic cassettes, flash memory cards, digital versatile disks, cartridges, random access memories (RAMs), read only memory (ROM), a cable or wireless signal containing a bit stream and the like, may also be used in the exemplary operating environment. To enable user interaction with the computing device 100, an input device 190 represents any number of input mechanisms, such as a microphone for speech, a touch-sensitive screen for gesture or graphical input, keyboard, mouse, motion input, speech and so forth. The input may be used by the presenter to indicate the beginning of a speech search query. The device output 170 can also be one or more of a number of output mechanisms known to those of skill in the art. In some instances, multimodal systems enable a user to provide multiple types of input to communicate with the computing device 100. The communications interface 180 generally governs and manages the user input and system output. There is no restriction in the disclosure limiting operation to being on any particular hardware arrangement and therefore the basic features here may easily be substituted for improved hardware or firmware arrangements as they are developed. For clarity of explanation, the illustrative system embodiment is presented as comprising individual functional blocks (including functional blocks labeled as a “processor”). 
The functions these blocks represent may be provided through the use of either shared or dedicated hardware, including, but not limited to, hardware capable of executing software, and hardware, such as a processor, that is purpose-built to operate as an equivalent to software executing on a general purpose processor. For example, the functions of one or more processors presented in FIG. 1 may be provided by a single shared processor or multiple processors. (Use of the term “processor” should not be construed to refer exclusively to hardware capable of executing software.) Illustrative embodiments may comprise microprocessor and/or digital signal processor (DSP) hardware, read-only memory (ROM) for storing software performing the operations discussed below, and random access memory (RAM) for storing results. Very large scale integration (VLSI) hardware embodiments, as well as custom VLSI circuitry in combination with a general purpose DSP circuit, may also be provided. The logical operations of the various embodiments are implemented as: (1) a sequence of computer implemented steps, operations, or procedures running on a programmable circuit within a general use computer, (2) a sequence of computer implemented steps, operations, or procedures running on a specific-use programmable circuit; and/or (3) interconnected machine modules or program engines within the programmable circuits. Having disclosed the fundamental elements of an example system embodiment which can be configured to practice the principles described herein, the disclosure turns to various examples which are discussed in the context of the system embodiment. As noted above, the subject matter of the present disclosure enables the scale-up of an image of an article displayed on a web page. Any on-line store that presents an image of an article for sale is contemplated as within the scope of this disclosure. Other web pages which display images and thumbnails are also within the scope of this disclosure. 
FIG. 2A illustrates an article for sale on a web page. The web page 200 displays an article for sale, which in this case is a nylon backpack. The web page displays a main image 202 for the article. The web page also includes thumbnail images 210 of various views of the article so that a user can examine the article. When the user selects a thumbnail image 210 based on cursor location or similar selection, the system exchanges the thumbnail image with the displayed image in a smooth, animated manner. The exchange can be exact, where the image and the thumbnail swap exact positions. The exchange can be approximate. For example, the image can return to its previous position in the set of thumbnails and the thumbnail can move to the main image's previous location, leaving a gap in the set of thumbnails. The transition is a smooth, fluid movement of both the image and the thumbnail. The image is shrunk down to a thumbnail scale and size, while the thumbnail is expanded to a larger scale and size. The web page includes a main name or title of the article 204. The web page further displays customer rating metadata 208. This customer rating metadata 208 indicates the average rating from users that have written a review, as well as a link to read the reviews or write a new review. On the right side of web page 200, the system displays price and shipping cost 212 and a link to add the item to a virtual cart, as well as an option to sign up for 1-click. 1-click is an option wherein a user can purchase items with a single click of the mouse. Additionally, the right side of web page 200 includes an option to select one of two colors 214 for the displayed item. The right side of the web page 200 includes save and print options 216. These allow a user to print and save the displayed information on the article for sale for later viewing. FIG. 2B illustrates a scaled up image of the article for sale in FIG. 2A. 
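The exchange animation described above, in which the main image and a selected thumbnail trade position and size in one smooth, fluid movement, can be modeled by interpolating both boxes over the same sequence of frames. The following Python sketch is illustrative only; the box representation and function names are assumptions, not the patent's code.

```python
# Hedged sketch of the image/thumbnail exchange: both boxes move toward
# each other's starting location and scale over the same frames, so the
# swap reads as a single simultaneous animation.
def lerp(a: float, b: float, t: float) -> float:
    return a + t * (b - a)

def exchange_frames(image_box, thumb_box, steps):
    """Each box is (x, y, size). Yields (image_box, thumb_box) per frame:
    the image shrinks toward the thumbnail's slot while the thumbnail
    grows into the main image's previous location."""
    ix, iy, isz = image_box
    tx, ty, tsz = thumb_box
    for i in range(1, steps + 1):
        t = i / steps
        yield ((lerp(ix, tx, t), lerp(iy, ty, t), lerp(isz, tsz, t)),
               (lerp(tx, ix, t), lerp(ty, iy, t), lerp(tsz, isz, t)))

# After the last frame the two boxes have swapped exactly:
frames = list(exchange_frames((0, 0, 400), (500, 300, 80), steps=8))
print(frames[-1])  # → ((500.0, 300.0, 80.0), (0.0, 0.0, 400.0))
```

Driving both movements from the same interpolation parameter is one simple way to keep the shrink and the expansion synchronized, which is what makes the exchange feel like a single transition rather than two separate ones.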
The system presents a scaled up image of the article for sale based on a cursor selection in one embodiment. Cursor selection can occur when a user moves a cursor over the displayed image. Alternatively, a user can trigger a cursor selection with the system based on a certain number of clicks on the displayed image. For example, an online store can designate a single click to display the image and a double click on the displayed image to act as a cursor selection and trigger the transformation to the scaled-up image of the article. As will be appreciated by those persons skilled in these arts, other types of cursor activity, such as mouse gestures, can also define a cursor selection. The web page 200 displays the same article for sale as FIG. 2A, which in this case is a nylon backpack. The web page displays an enlarged main image 220 for the article. The system enlarges the main image 220 to a second, larger scale on the same web page 200. During the scaling-up from the first image 202 to the second image 220, the system shifts the image to a second location on the same web page 200 while simultaneously scaling-up from the first scale image 202 to the second larger scale image 220. In this embodiment the enlarged scale main image 220 is enlarged by continuously and smoothly expanding the scale of the image from the first to the second larger scale. Such a continuous and smooth expansion can provide an animation-like effect. The web page 200 also includes thumbnail images 222 where a user can select other images of the article. The thumbnail images 222 differ in quantity and location from those depicted in FIG. 2A. The web page 200 includes a main name of the article 204 in roughly the same position as compared to FIG. 2A. The web page 200 further displays customer rating metadata 208. 
In this embodiment, the system moves the customer rating metadata 208 from a bottom left corner of web page 200 to the bottom right corner of web page 200 to allow for the enlarged main image 220. The customer rating metadata 208 options are unchanged from FIG. 2A and indicate the average rating from users that have written a review, as well as a link to read the reviews or write a new review. On the right side of web page 200, the system displays the price, shipping cost, and a link to add the item to a virtual cart 212, as well as an option to sign up for 1-click. Additionally, the right side of web page 200 includes an option to select one of two colors 214 for the displayed item. Also, the right side of the web page 200 includes save and print options 216. The system does not change the location of the price and shipping area information 212, the option to select one of two colors 214 for the displayed item, and save and print options 216 from their respective positions in FIG. 2A and FIG. 2B. These options, as well as any other information presented along with an article for sale, may be relocated on a web page, redacted, edited, or removed to accommodate the enlargement of the main image. In one embodiment, the system animates the web page elements which change to accommodate the enlarged image so they move smoothly from their original position and size to their new position and size in the same manner as the image and the thumbnail. FIG. 3A illustrates enlarging the scale of a portion of the second larger scale image to a third even larger scale image based on cursor designation of the portion. The system can enlarge the image by continuously expanding the scale of the portion of the image from the second larger scale to the third even larger scale. As shown in FIG. 3A the web page 300 displays a third larger scale image 302 based on cursor designation of the portion. 
In one example, the user selects the portion of the backpack image showing the name tag of the backpack by single clicking a mouse near the name tag of the backpack on the image. However, as described above, the cursor designation can be defined as any input from any human interface device from the user to affect the cursor, such as single-clicking, double-clicking, or gesturing on a desired designated area on a displayed article image. Also, the system can define right button mouse or trackball clicks to act as a cursor designation. The system can display all of the scale-ups of an image of an article on a web page within the same page without refreshing or navigating elsewhere. A system, such as a web server, can use Asynchronous JavaScript and XML (AJAX) to provide such an effect. It is also contemplated that other technologies, languages, and web platforms can be used. The web page 300 further displays customer rating metadata 308. On the right side of web page 300 a price and shipping area 312 displays price, shipping cost, and a link to add the item to a virtual cart, as well as an option to sign up for 1-click. Additionally, the right side of web page 300 includes an option to select one of two colors 314 for the displayed item. Also, the right side of the web page 300 includes save and print options 316. FIG. 3B illustrates a higher resolution replacement image 304 that replaces the third even larger scale image portion 302. On the right side of web page 300 a price and shipping area 312 displays price, shipping cost, and a link to add the item to a virtual cart, as well as an option to sign up for 1-click. Additionally, the right side of web page 300 includes an option to select one of two colors 314 for the displayed item. The right side of the web page 300 includes save and print options 316. 
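The same-page behavior described above, where every scale-up happens without refreshing or navigating elsewhere, can be modeled abstractly as in-place updates to the current page state, with a higher-resolution asset fetched in the background. This Python sketch is a stand-in for AJAX-style partial updates, not browser code; the `Page` class and its fake `fetch_higher_res` request are illustrative assumptions.

```python
# Illustrative model of a same-page scale-up: the zoom mutates the
# current page state and fetches a higher-resolution tile in place of
# a full page navigation, so the reload counter never moves.
class Page:
    def __init__(self):
        self.reloads = 0  # would increment on a full page navigation
        self.image = {"src": "backpack.jpg", "scale": 1.0}

    def fetch_higher_res(self, src: str) -> str:
        # stand-in for an asynchronous request for a higher-resolution image
        return src.replace(".jpg", "_hires.jpg")

    def zoom_portion(self, scale: float):
        # update in place: no navigation occurs, so no reload is counted
        self.image["scale"] = scale
        self.image["src"] = self.fetch_higher_res(self.image["src"])

page = Page()
page.zoom_portion(3.0)
print(page.reloads, page.image["src"])  # → 0 backpack_hires.jpg
```

The point of the sketch is the invariant: however many scale-ups occur, the reload count stays at zero, which is the property the text attributes to the AJAX-based approach.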
The location of price and shipping area information 312, an option to select one of two colors 314 for the displayed item, and save and print options 316 remain in the same relative positions in FIG. 3A and FIG. 3B. However, these options, as well as any other information presented along with an article for sale may be relocated, edited, redacted, or removed upon the replacement of the third larger scale image 302 with the higher resolution image 304. FIG. 4 illustrates an example method embodiment for scaling up an image of an article displayed on a sales promotion web page. The method embodiment is discussed in terms of a system configured to practice the method. Such a system can be a single web server, a cluster of web servers, a personal computer, a PDA, smartphone, etcetera. The system displays an image of an article on a web page in a first scale (402). The system enlarges the image to a second larger scale on the same web page based on a selection of the displayed image by smoothly expanding the image from the first scale to the second larger scale (404). In one embodiment, the system shifts the image to a second location on the same web page while simultaneously enlarging the image. In another embodiment, the system enlarges a portion of the second larger scale image to a third even larger scale image based on a designation of the portion by smoothly expanding the portion of the image from the second larger scale to the third even larger scale. In one variation on this embodiment, the system replaces the third even larger scale image portion with a corresponding higher resolution image. A user who desires to see fine details of an article may find this variation useful. For example, with articles of clothing, a high-level picture of an entire blouse is insufficient to convey the subtle colors and textures which are more apparent when viewed from a shorter distance. 
With articles of this sort, users appreciate and benefit from enlarged, higher resolution images. In one embodiment involving thumbnails, the system displays alternative thumbnail images of the article on the same web page, receives a designation of a selected thumbnail image, and exchanges the selected thumbnail with the enlarged image by simultaneously smoothly enlarging the thumbnail, smoothly reducing the enlarged image, and smoothly transitioning the image and the thumbnail to each other's respective location. This effect provides immediate feedback to a user so that the user knows where an image comes from and where an image goes. Users can track each image and thumbnail visually through the smoothly animated motion and scaling. In one variation, the system remembers the scale of the enlarged image and applies it to an exchanged thumbnail. For example, if image A is zoomed into a higher resolution view, and thumbnail B is selected to replace it, the system swaps the image A and thumbnail B. The system further enlarges thumbnail B to the same level of detail and resolution at which image A had been before. In a second variation, if image A is zoomed into a higher resolution view, and thumbnail B is selected to replace it, the system swaps the image A and thumbnail B. However, in this second variation the system enlarges thumbnail B to a default level of detail and resolution. Then upon a user selection of a portion of image B the system enlarges image B to the same level of detail and resolution at which image A had been before. In order to accommodate a larger scale image, the system can remove, relocate, or reformat web page elements such as the title, price, reviews, overview, etcetera shown in FIGS. 2A, 2B, 3A, and 3B. The system can further overlay semitransparent information, such as web page elements, over the second larger scale image, at least partially. The system can overlay the information in order to allow a larger image. 
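The two thumbnail-swap variations above differ only in when the remembered zoom level of outgoing image A is applied to incoming image B: immediately in the first variation, or deferred until the user selects a portion of B in the second. This hedged Python sketch captures that distinction; the `Gallery` class and its method names are illustrative assumptions, not the patent's code.

```python
# Illustrative model of the two swap variations: the gallery remembers
# the outgoing image A's zoom level and applies it either immediately
# (variation 1) or only once the user zooms into B (variation 2).
class Gallery:
    DEFAULT_SCALE = 1.0

    def __init__(self, remember_scale: bool):
        self.remember_scale = remember_scale
        self.last_scale = 1.0  # zoom level of the outgoing image A

    def swap_in_thumbnail(self, a_scale: float) -> float:
        """Return the scale at which incoming image B is first shown."""
        self.last_scale = a_scale
        return a_scale if self.remember_scale else self.DEFAULT_SCALE

    def select_portion_of_b(self) -> float:
        """Variation 2: zooming into a portion of B restores A's level."""
        return self.last_scale

print(Gallery(remember_scale=True).swap_in_thumbnail(3.0))  # → 3.0
g = Gallery(remember_scale=False)
print(g.swap_in_thumbnail(3.0))                             # → 1.0
print(g.select_portion_of_b())                              # → 3.0
```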
In some instances, a larger, partially obscured image is desirable over a smaller, unobscured image. Embodiments within the scope of the present disclosure may also include computer-readable media for carrying or having computer-executable instructions or data structures stored thereon. Such computer-readable media can be any available media that can be accessed by a general purpose or special purpose computer, including the functional design of any special purpose processor as discussed above. By way of example, and not limitation, such computer-readable media can comprise RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to carry or store desired program code means in the form of computer-executable instructions, data structures, or processor chip design. When information is transferred or provided over a network or another communications connection (either hardwired, wireless, or combination thereof) to a computer, the computer properly views the connection as a computer-readable medium. Thus, any such connection is properly termed a computer-readable medium. Combinations of the above are also included within the scope of the described computer-readable media. Computer-executable instructions include, for example, instructions and data which cause a general purpose computer, special purpose computer, or special purpose processing device to perform a certain function or group of functions. Computer-executable instructions also include program modules that are executed by computers in stand-alone or network environments. Generally, program modules include routines, programs, objects, components, data structures, and the functions inherent in the design of special-purpose processors, etcetera that perform particular tasks or implement particular abstract data types. 
Computer-executable instructions, associated data structures, and program modules represent examples of the program code means for executing steps of the methods disclosed herein. The particular sequence of such executable instructions or associated data structures represents examples of corresponding acts for implementing the functions described in such steps. Those of skill in the art will appreciate that other embodiments configured according to these teachings may be practiced in network computing environments with many types of computer system configurations, including personal computers, hand-held devices, multi-processor systems, microprocessor-based or programmable consumer electronics, network PCs, minicomputers, mainframe computers, and the like. Embodiments may also be practiced in distributed computing environments where tasks are performed by local and remote processing devices that are linked (either by hardwired links, wireless links, or by a combination thereof) through a communications network. In a distributed computing environment, program modules may be located in both local and remote memory storage devices. Although the above description may contain specific details, it should not be construed as limiting to the claims in any way. Other configurations of the described embodiments also fit within the scope of what is claimed. For example, the principles described herein can be applied to non-commercial, non-advertising images, such as an online photo album. Accordingly, the claims and their legal equivalents are what define the patent coverage contained herein, rather than any specific examples given. 15789079 apple inc. USA E1 Reissue Patent Open Apr 27th, 2022 08:45AM Apr 27th, 2022 08:45AM Technology Technology Hardware & Equipment Information Technology
nasdaq:aapl Apple Apr 26th, 2022 12:00AM Aug 17th, 2020 12:00AM https://www.uspto.gov?id=US11317398-20220426 Semi-persistent scheduling for autonomous transmission activation and release Described is an apparatus of a User Equipment (UE). The apparatus may comprise a first circuitry and a second circuitry. The first circuitry may be operable to process a first Downlink Control Information (DCI) format 0A transmission indicating a semi-persistent scheduling (SPS) activation. The first circuitry may also be operable to process a second DCI format 0A transmission indicating an SPS release. The second circuitry may be operable to generate one or more Uplink (UL) transmissions for an unlicensed spectrum of the wireless network after the SPS activation and before the SPS release in accordance with a configured schedule. 11317398 1. A User Equipment (UE) configured to communicate with a base station on a wireless network, comprising: one or more processors configured to: process a first Downlink Control Information (DCI) transmission indicating a semi-persistent scheduling (SPS) activation; and generate one or more Physical Uplink Shared Channel (PUSCH) transmissions for an unlicensed spectrum of the wireless network after the SPS activation and before an SPS release; and transceiver circuitry configured to transmit the one or more PUSCH transmissions and to receive the first DCI transmission. 2. The UE of claim 1, wherein the one or more processors are further configured to process a second DCI transmission indicating the SPS release. 3. The UE of claim 2, wherein the first DCI transmission or the second DCI transmission has DCI format 0A, DCI format 0B, DCI format 4A, DCI format 4B, or a DCI format dedicated to autonomous uplink (AUL) activation and AUL release implemented by the SPS activation and the SPS release. 4. 
The UE of claim 2, wherein the first DCI transmission or the second DCI transmission includes an additional indicator related to one or more resources where autonomous uplink (AUL) is performed. 5. The UE of claim 2, wherein the first DCI transmission or the second DCI transmission includes a format flag. 6. The UE of claim 1, wherein the first DCI transmission includes a DCI transmission carrying a UE-specific parameter. 7. The UE of claim 1, wherein the first DCI transmission is scrambled with an Autonomous Uplink Cell Radio Network Temporary Identifier (AUL-C-RNTI). 8. The UE of claim 1, wherein the wireless network supports dual connectivity (DC) based Licensed-Assisted Access (LAA) in the unlicensed spectrum or standalone access in the unlicensed spectrum. 9. A method comprising: processing, by a User Equipment (UE), a first Downlink Control Information (DCI) transmission indicating a semi-persistent scheduling (SPS) activation; generating, by the UE, one or more Physical Uplink Shared Channel (PUSCH) transmissions for an unlicensed spectrum of a wireless network after the SPS activation and before an SPS release; and transmitting, by the UE, the one or more PUSCH transmissions to a base station of the wireless network. 10. The method of claim 9, further comprising: processing, by the UE, a second DCI transmission indicating the SPS release. 11. The method of claim 10, wherein the first DCI transmission or the second DCI transmission has DCI format 0A, DCI format 0B, DCI format 4A, DCI format 4B, or a DCI format dedicated to autonomous uplink (AUL) activation and AUL release implemented by the SPS activation and the SPS release. 12. The method of claim 9, wherein the first DCI transmission includes a DCI transmission carrying a UE-specific parameter. 13. The method of claim 9, wherein the first DCI transmission is scrambled with an Autonomous Uplink Cell Radio Network Temporary Identifier (AUL-C-RNTI). 14. 
The method of claim 9, wherein the wireless network supports dual connectivity (DC) based Licensed-Assisted Access (LAA) in the unlicensed spectrum or standalone access in the unlicensed spectrum. 15. A User Equipment (UE) configured to communicate with a base station on a wireless network, comprising: one or more processors to: process a first Downlink (DL) transmission scrambled with an Autonomous Uplink Cell Radio Network Temporary Identifier (AUL-C-RNTI), the first DL transmission indicating a semi-persistent scheduling (SPS) activation; generate one or more Uplink (UL) transmissions for an unlicensed spectrum of the wireless network after the SPS activation and before an SPS release; and an interface for sending the one or more UL transmissions to a transmission circuitry and for receiving the first DL transmission from a receiving circuitry. 16. The UE of claim 15, wherein the one or more processors are further configured to process a second DL transmission scrambled with the AUL-C-RNTI, the second DL transmission indicating the SPS release. 17. The UE of claim 16, wherein at least one of the first DL transmission and the second DL transmission carries a UE-specific parameter. 18. The UE of claim 16, wherein the first DCI transmission or the second DCI transmission includes an additional indicator related to one or more resources where autonomous uplink (AUL) is performed. 19. The UE of claim 16, wherein the first DCI transmission or the second DCI transmission includes a format flag. 20. The UE of claim 15, wherein the wireless network supports dual connectivity (DC) based Licensed-Assisted Access (LAA) in the unlicensed spectrum or standalone access in the unlicensed spectrum. 20 CLAIM OF PRIORITY This application is a continuation of U.S. Non-Provisional application Ser. No. 16/476,038, filed Jul. 3, 2019, which is a National Stage Entry of and claims priority to, PCT Application No. PCT/US2018/014558, filed on Jan. 
19, 2018 and titled “Semi-Persistent Scheduling For Autonomous Transmission Activation and Release,” which claims priority under 35 U.S.C. § 119(e) to U.S. Provisional Patent Application 62/448,147 filed Jan. 19, 2017, all of which are herein incorporated by reference in their entireties. BACKGROUND A variety of wireless cellular communication systems have been implemented, including a 3rd Generation Partnership Project (3GPP) Universal Mobile Telecommunications System, a 3GPP Long-Term Evolution (LTE) system, and a 3GPP LTE-Advanced (LTE-A) system. Next-generation wireless cellular communication systems based upon LTE and LTE-A systems are being developed, such as a fifth generation (5G) wireless system/5G mobile networks system. Next-generation wireless cellular communication systems may provide support for higher bandwidths in part by using unlicensed spectrum. BRIEF DESCRIPTION OF THE DRAWINGS The embodiments of the disclosure will be understood more fully from the detailed description given below and from the accompanying drawings of various embodiments of the disclosure. However, while the drawings are to aid in explanation and understanding, they are only an aid, and should not be taken to limit the disclosure to the specific embodiments depicted therein. FIG. 1 illustrates a scenario of one or more Evolved Node Bs (eNBs) and one or more User Equipments (UEs), in accordance with some embodiments of the disclosure. FIG. 2 illustrates an eNB and a UE, in accordance with some embodiments of the disclosure. FIG. 3 illustrates hardware processing circuitries for a UE for SPS activation and/or SPS release, in accordance with some embodiments of the disclosure. FIGS. 4-5 illustrate methods for a UE for SPS activation and/or SPS release, in accordance with some embodiments of the disclosure. FIG. 6 illustrates example components of a device, in accordance with some embodiments of the disclosure. FIG. 
7 illustrates example interfaces of baseband circuitry, in accordance with some embodiments of the disclosure. DETAILED DESCRIPTION Various wireless cellular communication systems have been implemented or are being proposed, including a 3rd Generation Partnership Project (3GPP) Universal Mobile Telecommunications System (UMTS), a 3GPP Long-Term Evolution (LTE) system, a 3GPP LTE-Advanced (LTE-A) system, and a 5th Generation (5G) wireless system/5G mobile networks system. Due to the popularity of mobile devices and smart devices, the widespread adoption of wireless broadband has resulted in significant growth in the volume of mobile data traffic and has radically impacted system requirements, sometimes in divergent ways. For example, while it may be important to lower complexity, elongate battery life, and support high mobility and service continuity of devices, it may also be important to increase data rates and bandwidths and lower latencies to support modern applications. To meet the needs of future wireless networks, various physical layer techniques have been introduced (e.g., Multiple Input Multiple Output (MIMO) techniques, enhanced Inter-Cell Interference Coordination (ICIC) designs, coordinated multi-point designs, and so on). An increasing interest has also arisen in operating cellular networks in unlicensed spectrum to ameliorate the scarcity of licensed spectrum in low frequency bands, with the aim to further improve data rates. One enhancement for LTE in 3GPP Release 13 has been to enable operation in unlicensed spectrum via Licensed-Assisted Access (LAA), which may expand a system bandwidth by utilizing a flexible carrier aggregation (CA) framework introduced by the LTE-Advanced system. Enhanced operation of LTE systems in unlicensed spectrum is also expected in future releases, as well as in 5G systems. 
Potential LTE operations in unlicensed spectrum may include (but not be limited to) LTE system operation in the unlicensed spectrum via Dual Connectivity (DC) (e.g., DC-based LAA). Potential LTE operations in unlicensed spectrum may also include LTE-based technology operating solely in unlicensed spectrum without relying upon an “anchor” in licensed spectrum, such as in MulteFire™ technology by MulteFire Alliance of Fremont, Calif., USA. Standalone LTE operation in unlicensed spectrum may combine performance benefits of LTE technology with a relative simplicity of Wi-Fi®-like deployments. (Wi-Fi® is a registered trademark of the Wi-Fi Alliance of Austin, Tex., USA.) Standalone LTE operation may accordingly be an advantageous technology in meeting demands of ever-increasing wireless traffic. An unlicensed-spectrum frequency band of current interest for 3GPP systems is the 5 gigahertz (GHz) band, which may present a wide spectrum with global common availability. The 5 GHz band in the US is governed by Unlicensed National Information Infrastructure (U-NII) rules of the Federal Communications Commission (FCC). The primary incumbent systems in the 5 GHz band may be Wireless Local Area Networks (WLAN) systems, specifically those based on IEEE 802.11a/n/ac technologies. Since WLAN systems may be widely deployed both by individuals and operators for carrier-grade access service and data offloading, sufficient care should be taken before deployment of coexisting 3GPP systems. Accordingly, Listen-Before-Talk (LBT) may be a feature of Release 13 LAA systems to promote fair coexistence with incumbent systems. In an LBT procedure, a radio transmitter may first sense a medium and may transmit if the medium is sensed to be idle.
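As a purely illustrative aside (not part of the patent disclosure), the sense-then-transmit behavior of LBT can be sketched as a toy Monte Carlo fragment; the 0.7 medium-idle probability and the trial count are assumed values chosen only for the sketch:

```python
import random

def lbt(p_idle):
    """Listen-Before-Talk: sense the medium, and report whether it was
    found idle (transmission may proceed) or busy (the node must defer)."""
    return random.random() < p_idle

# Estimate how often a transmitter facing a 0.7 medium-idle probability
# actually gets to transmit under LBT (assumed, illustrative numbers).
random.seed(0)
attempts = 100_000
rate = sum(lbt(0.7) for _ in range(attempts)) / attempts
print(f"transmitted on ~{rate:.0%} of attempts")
```

Note that when two independent LBT gates sit in series on one transmission chain, the per-attempt success rate drops to roughly the square of the single-gate rate, which is why double sensing requirements can starve uplink traffic.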
Meanwhile, in schedule-based Uplink (UL) designs, UL Physical Uplink Shared Channel (PUSCH) transmission may be determined based on explicit UL grant transmission via Physical Downlink Control Channel (PDCCH) (e.g., via Downlink Control Information (DCI) format 0). UL grant transmission may be performed after completing an LBT procedure at an Evolved Node-B (eNB). After receiving a UL grant, a scheduled User Equipment (UE) may perform a short LBT or Category 4 (Cat 4) LBT during an allocated time interval. If the LBT is successful at the scheduled UE, then the UE may transmit PUSCH on resources indicated by the UL grant. Due to the double LBT requirement at both the eNB (when sending the UL grant) and at the scheduled UEs (before UL transmission), UL performance in unlicensed spectrum may be significantly degraded by UL starvation. This is a general problem when a scheduled system (such as LTE) coexists with a non-scheduled autonomous system (such as Wi-Fi®). Accordingly, in various embodiments, autonomous UL (AUL) transmission (which may also be termed General UL (GUL) transmission) may be employed to improve the performance of UL transmission. AUL may be activated, released, and configured in a variety of manners. In a first type of embodiment, a Radio Resource Control (RRC) protocol may be used. This protocol may support the transfer of both common Non-Access Stratum (NAS) information and dedicated NAS information related to specific UEs. RRC connection establishment may include the establishment of a Signaling Radio Bearer 1 (SRB1) and the transfer of an initial uplink NAS message. This NAS message may trigger an establishment of an S1 connection, which may then initiate a subsequent step during which an Evolved Universal Mobile Telecommunications Systems Terrestrial Radio Access Network (E-UTRAN) may activate Access Stratum (AS) security, and/or may establish a Signaling Radio Bearer 2 (SRB2) and one or more Data Radio Bearers (DRBs).
This entire procedure may take 16 or more milliseconds (ms). RRC connection release may be initiated by an eNB following release of an S1 connection between the eNB and a Core Network (CN). In a second type of embodiment, a semi-persistent scheduling (SPS) scheme may be adopted. The periodicity of a semi-persistent scheduled transmission may be configured by RRC signaling in advance, while activation may be done using PDCCH and/or enhanced PDCCH (ePDCCH) using a semi-persistent Cell Radio Network Temporary Identifier (C-RNTI). After enabling SPS, a terminal may continue to monitor PDCCH and/or ePDCCH for DL scheduling commands. When a dynamic scheduling command is detected, it may take precedence over SPS in that particular subframe, which may be useful if semi-persistently allocated resources occasionally need to be increased. The first type of embodiment and the second type of embodiment may both be feasible, although each type of embodiment may have its advantages and disadvantages. While RRC-based embodiments may be simpler and may not incorporate various changes to support an AUL activation and release procedure, RRC protocols may need more time for activation and release compared to SPS, and may induce higher latencies. This may be due to SPS being primarily affected by typical Hybrid Automatic Repeat Request (HARQ) Acknowledgement (ACK) delays, which may be shorter than typical RRC procedure delays. However, legacy SPS activation and release mechanisms may not be suitable for signaling AUL activation and release. Legacy SPS mechanisms may be based on the use of DCI format 0, which may have a format in accordance with Table 1 below. (In Table 1, N may be a number of resource blocks scheduled for UL.)
TABLE 1
DCI format 0

Field                                                        Length in Bits
Carrier indicator                                            3
Format for format 0/format 1A (e.g., format flag (FF))       1
Frequency hopping (FH) flag                                  1
Resource block assignment and hopping resource allocation    Log2[N(N + 1)/2]
Modulation and coding scheme and redundancy version          5
New data indicator                                           1
TPC command                                                  2
Cyclic shift for DM-RS and OCC index (CS/OCC)                3
UL index                                                     2
Downlink assignment index (DAI)                              2
CSI request                                                  1, 2, or 3
SRS request                                                  0 or 1
Resource allocation type                                     1

In accordance with legacy SPS activation and release mechanisms, some of the fields of DCI format 0 may be set to some specific values in order to signal AUL activation and/or AUL release. In particular, signaling of AUL activation may employ settings provided in Table 2 below (related to DCI format 0 for SPS activation PDCCH/ePDCCH validation), while signaling of AUL release may employ settings provided in Table 3 below (related to DCI format 0 for SPS release PDCCH/ePDCCH validation).

TABLE 2
DCI format 0 fields for SPS activation

Field                                                        Setting for DCI format 0
TPC command                                                  Set to ‘00’
Cyclic shift for DM-RS and OCC index (CS/OCC)                Set to ‘000’
Modulation and coding scheme and redundancy version          MSB is set to ‘0’
HARQ process number                                          N/A
Modulation and coding scheme                                 N/A
Redundancy version                                           N/A

TABLE 3
DCI format 0 fields for SPS release

Field                                                        Setting for DCI format 0
TPC command                                                  Set to ‘00’
Cyclic shift for DM-RS and OCC index (CS/OCC)                Set to ‘000’
Modulation and coding scheme (MCS) and redundancy version    Set to ‘11111’
Resource block assignment and hopping resource allocation    Set to all ‘1’s
HARQ process number                                          N/A
Modulation and coding scheme                                 N/A
Redundancy version                                           N/A
Resource block assignment                                    N/A

Unfortunately, DCI format 0 may not be supported in enhanced LAA (eLAA) and/or MulteFire™. Instead, DCI format 0A, DCI format 0B, DCI format 4A, and/or DCI format 4B may be used, which may have fields as summarized in Table 4 below (in which “S” may indicate a number of scheduled subframes).
While these DCIs may carry fields similar to DCI format 0, and may serve as replacements, they might not support PDCCH and/or ePDCCH SPS activation and SPS release functionalities.

TABLE 4
DCI format 0A/0B/4A/4B fields

                                                   DCI format  DCI format  DCI format  DCI format
                                                   0A (bits)   0B (bits)   4A (bits)   4B (bits)
Carrier indicator                                  0 or 3      0 or 3      0 or 3      0 or 3
Format 0/1A format flag                            1           —           —           —
PUSCH or MulteFire™ enhanced Physical Uplink
  Control Channel (ePUCCH) trigger A               1           1           1           1
Timing offset                                      4           4           4           4
Resource block assignment                          4 or 6      4 or 6      4 or 6      4 or 6
Modulation and coding scheme                       5           5           10          10
HARQ ID                                            4           4           4           4
New Data Indicator                                 1           S           2           2S
Redundancy version                                 2           S           2           S
TPC command                                        2           2           2           2
Cyclic shift for DM-RS and OCC index               3           3           3           3
CSI request                                        1, 2, or 3  1, 2, or 3  1, 2, or 3  1, 2, or 3
HARQ-ACK request                                   1           1           1           1
SRS request                                        1           2           2           2
PUSCH or MulteFire™ ePUCCH starting position       2           2           2           2
PUSCH or MulteFire™ ePUCCH ending symbol           1           1           1           1
Channel access type                                1           1           1           1
Channel access priority class                      2           2           2           2
Number of scheduled subframes                      —           1 or 2      —           1 or 2

Discussed herein are various embodiments for AUL activation and/or AUL release which may employ an SPS approach (which may be referred to for purposes of this application as SPS AUL activation and/or SPS AUL release, and/or as SPS activation and/or SPS release). Some embodiments may employ a “clean slate” solution with a design of a new DCI format dedicated to this functionality. Some embodiments may employ reuse of and/or extension of DCI format 0 scheduling of PUSCH with support for SPS. Some embodiments may employ reuse of DCI format 0A, DCI format 4A, DCI format 0B, and/or DCI format 4B for the scheduling of PUSCH. Some embodiments may use a number N of bits in a common PDCCH (cPDCCH) reserved for these functionalities. Some embodiments may employ UE-group-specific or cell-specific DCI. Some embodiments may employ cell-specific SPS RRC.
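For illustration only (this sketch is not part of the patent text), the legacy validation logic of Table 2 and Table 3 can be expressed as a function over DCI format 0 fields; the dictionary keys and the bit-string encoding are assumptions made for this sketch:

```python
def classify_sps_dci0(fields):
    """Classify a DCI format 0 (scrambled with the SPS C-RNTI) as an SPS
    activation, an SPS release, or neither, following the Table 2 and
    Table 3 settings. `fields` maps illustrative names to bit strings."""
    # Both activation and release require TPC '00' and CS/OCC '000'.
    if fields["tpc"] != "00" or fields["cs_occ"] != "000":
        return None
    # Table 3 (release): MCS/RV all ones, RBA/hopping allocation all ones.
    if fields["mcs_rv"] == "11111" and set(fields["rba"]) == {"1"}:
        return "release"
    # Table 2 (activation): MSB of the MCS/RV field set to '0'.
    if fields["mcs_rv"][0] == "0":
        return "activation"
    return None
```

Any DCI whose fields match neither table would be treated as an ordinary dynamic grant rather than an SPS command.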
In addition, in various embodiments, DCI may include additional information and/or indicators related to one or more resources where AUL may be performed, and additional information and/or indicators related to one or more resources that may be used for AUL to configure activation and/or release. In the following description, numerous details are discussed to provide a more thorough explanation of embodiments of the present disclosure. It will be apparent to one skilled in the art, however, that embodiments of the present disclosure may be practiced without these specific details. In other instances, well-known structures and devices are shown in block diagram form, rather than in detail, in order to avoid obscuring embodiments of the present disclosure. Note that in the corresponding drawings of the embodiments, signals are represented with lines. Some lines may be thicker, to indicate a greater number of constituent signal paths, and/or have arrows at one or more ends, to indicate a direction of information flow. Such indications are not intended to be limiting. Rather, the lines are used in connection with one or more exemplary embodiments to facilitate easier understanding of a circuit or a logical unit. Any represented signal, as dictated by design needs or preferences, may actually comprise one or more signals that may travel in either direction and may be implemented with any suitable type of signal scheme. Throughout the specification, and in the claims, the term “connected” means a direct electrical, mechanical, or magnetic connection between the things that are connected, without any intermediary devices. The term “coupled” means either a direct electrical, mechanical, or magnetic connection between the things that are connected or an indirect connection through one or more passive or active intermediary devices. 
The term “circuit” or “module” may refer to one or more passive and/or active components that are arranged to cooperate with one another to provide a desired function. The term “signal” may refer to at least one current signal, voltage signal, magnetic signal, or data/clock signal. The meaning of “a,” “an,” and “the” include plural references. The meaning of “in” includes “in” and “on.” The terms “substantially,” “close,” “approximately,” “near,” and “about” generally refer to being within +/−10% of a target value. Unless otherwise specified, the use of the ordinal adjectives “first,” “second,” and “third,” etc., to describe a common object merely indicates that different instances of like objects are being referred to, and is not intended to imply that the objects so described must be in a given sequence, either temporally, spatially, in ranking, or in any other manner. It is to be understood that the terms so used are interchangeable under appropriate circumstances such that the embodiments of the invention described herein are, for example, capable of operation in other orientations than those illustrated or otherwise described herein. The terms “left,” “right,” “front,” “back,” “top,” “bottom,” “over,” “under,” and the like in the description and in the claims, if any, are used for descriptive purposes and not necessarily for describing permanent relative positions. For purposes of the embodiments, the transistors in various circuits, modules, and logic blocks are Tunneling FETs (TFETs). Some transistors of various embodiments may comprise metal oxide semiconductor (MOS) transistors, which include drain, source, gate, and bulk terminals. The transistors may also include Tri-Gate and FinFET transistors, Gate All Around Cylindrical Transistors, Square Wire, or Rectangular Ribbon Transistors or other devices implementing transistor functionality like carbon nanotubes or spintronic devices.
A MOSFET has symmetrical source and drain terminals, i.e., they are identical terminals and are used interchangeably here. A TFET device, on the other hand, has asymmetric Source and Drain terminals. Those skilled in the art will appreciate that other transistors, for example, Bi-polar junction transistors (BJT PNP/NPN), BiCMOS, CMOS, etc., may be used for some transistors without departing from the scope of the disclosure. For the purposes of the present disclosure, the phrases “A and/or B” and “A or B” mean (A), (B), or (A and B). For the purposes of the present disclosure, the phrase “A, B, and/or C” means (A), (B), (C), (A and B), (A and C), (B and C), or (A, B and C). In addition, the various elements of combinatorial logic and sequential logic discussed in the present disclosure may pertain both to physical structures (such as AND gates, OR gates, or XOR gates), and to synthesized or otherwise optimized collections of devices implementing the logical structures that are Boolean equivalents of the logic under discussion. In addition, for purposes of the present disclosure, the term “eNB” may refer to a legacy LTE capable Evolved Node-B (eNB), a next-generation or 5G capable eNB, an Access Point (AP), and/or another base station for a wireless communication system. The term “gNB” may refer to a 5G-capable or NR-capable eNB. For purposes of the present disclosure, the term “UE” may refer to a legacy LTE capable User Equipment (UE), a Station (STA), and/or another mobile equipment for a wireless communication system. The term “UE” may also refer to a next-generation or 5G capable UE. Various embodiments of eNBs and/or UEs discussed below may process one or more transmissions of various types. Some processing of a transmission may comprise demodulating, decoding, detecting, parsing, and/or otherwise handling a transmission that has been received.
In some embodiments, an eNB or UE processing a transmission may determine or recognize the transmission's type and/or a condition associated with the transmission. For some embodiments, an eNB or UE processing a transmission may act in accordance with the transmission's type, and/or may act conditionally based upon the transmission's type. An eNB or UE processing a transmission may also recognize one or more values or fields of data carried by the transmission. Processing a transmission may comprise moving the transmission through one or more layers of a protocol stack (which may be implemented in, e.g., hardware and/or software-configured elements), such as by moving a transmission that has been received by an eNB or a UE through one or more layers of a protocol stack. Various embodiments of eNBs and/or UEs discussed below may also generate one or more transmissions of various types. Some generating of a transmission may comprise modulating, encoding, formatting, assembling, and/or otherwise handling a transmission that is to be transmitted. In some embodiments, an eNB or UE generating a transmission may establish the transmission's type and/or a condition associated with the transmission. For some embodiments, an eNB or UE generating a transmission may act in accordance with the transmission's type, and/or may act conditionally based upon the transmission's type. An eNB or UE generating a transmission may also determine one or more values or fields of data carried by the transmission. Generating a transmission may comprise moving the transmission through one or more layers of a protocol stack (which may be implemented in, e.g., hardware and/or software-configured elements), such as by moving a transmission to be sent by an eNB or a UE through one or more layers of a protocol stack. 
In various embodiments, resources may span various Resource Blocks (RBs), Physical Resource Blocks (PRBs), and/or time periods (e.g., frames, subframes, and/or slots) of a wireless communication system. In some contexts, allocated resources (e.g., channels, Orthogonal Frequency-Division Multiplexing (OFDM) symbols, subcarrier frequencies, resource elements (REs), and/or portions thereof) may be formatted for (and prior to) transmission over a wireless communication link. In other contexts, allocated resources (e.g., channels, OFDM symbols, subcarrier frequencies, REs, and/or portions thereof) may be detected from (and subsequent to) reception over a wireless communication link. FIG. 1 illustrates a scenario of one or more eNBs and one or more UEs, in accordance with some embodiments of the disclosure. A scenario 100 may comprise a first eNB 110 serving a first cell 111 and a second eNB 120 serving a second cell 121. A first UE 130 may be positioned with respect to first cell 111 and second cell 121 in such a way as to permit wireless communication with both first eNB 110 and second eNB 120, whereas a second UE 140 may be positioned with respect to first cell 111 and second cell 121 in such a way as to permit wireless communication merely with second eNB 120. First eNB 110 may support both DC-based LAA in unlicensed spectrum and standalone access in unlicensed spectrum, while second eNB 120 may merely support standalone access in unlicensed spectrum. Accordingly, first UE 130 may employ either DC-based LAA in unlicensed spectrum or standalone access in unlicensed spectrum, while second UE 140 may employ standalone access in unlicensed spectrum. Either first UE 130 or second UE 140 may employ an SPS approach for AUL activation and/or AUL release. Embodiments falling within a first type may employ a “clean slate” solution with a design of a new DCI format dedicated to this functionality. 
To support AUL activation and/or AUL release, a new DCI format may be defined, which may be scrambled via SPS C-RNTI or via a newly-defined RNTI (for example, via an Autonomous Uplink Cell Radio Network Temporary Identifier (AUL-C-RNTI)) which may use reserved RNTI values. The DCI may have a structure including one or more of the fields in Table 5 below.

TABLE 5
New DCI format

Field                                                                Length in bits
Carrier indicator                                                    3
Format flag (FF)                                                     1
Resource block assignment (RBA)                                      4 or 6
Modulation and coding scheme (MCS)                                   5
TPC command                                                          2
Cyclic shift (CS) for DM-RS and Orthogonal Cover Code (OCC)
  index (CS/OCC)                                                     3
AUL activation/release                                               1

In some embodiments, the FF field may not be needed if a size of the DCI is different from an existing DCI. For some embodiments, the RBA field may not be needed if AUL uses the whole system bandwidth. In some embodiments, the MCS field may not be needed if an MCS of AUL may be determined autonomously and/or the AUL activation and AUL release does not rely on this field. For some embodiments, the TPC command may not be needed if the UE configured with AUL performs power control autonomously (e.g., based on downlink RSRP) and/or the AUL activation and AUL release does not rely on this field. In some embodiments, the CS/OCC field may not be needed if AUL selects CS/OCC autonomously or CS/OCC are predefined and/or the AUL activation and AUL release does not rely on this field. For some embodiments, the AUL activation/release field may not be needed if the AUL activation and/or AUL release is not indicated explicitly. In some embodiments, criteria in accordance with Table 6 and Table 7 below may be used. Table 6 below may indicate settings for signaling AUL activation via one or more fields. Table 7 below may indicate settings for signaling AUL release via one or more fields.
TABLE 6
Settings to signal AUL activation

Field                                              Setting
TPC command                                        Set to ‘00’
Cyclic shift for DM-RS and OCC index (CS/OCC)      Set to ‘000’
Modulation and coding scheme (MCS)                 MSB is set to ‘0’

TABLE 7
Settings to signal AUL release

Field                                              Setting
TPC command                                        Set to ‘00’
Cyclic shift for DM-RS and OCC index (CS/OCC)      Set to ‘000’
Modulation and coding scheme (MCS)                 Set to ‘11111’

Note that the settings in Table 6 and Table 7 are exemplary, and that other codings may be used in other embodiments (e.g., other default values for the fields provided above may be used for signaling AUL activation and/or AUL release). Embodiments falling within a second type may employ reuse and/or extension of DCI format 0 for the scheduling of PUSCH with support for SPS. In various LTE embodiments, DCI format 0 may be used for the scheduling of PUSCH in one UL cell (and may have the structure provided in Table 1). In some LTE embodiments, in order to activate an SPS assignment, a first set of settings may be used (e.g., the settings summarized in Table 2), while in order to deactivate an SPS assignment, a second set of settings may be used (e.g., the settings summarized in Table 3). In the context of providing support for AUL activation and/or AUL release, a DCI format 0 may be reused (e.g., as summarized in Table 1). In such embodiments, while the settings in Table 2 and Table 3 may be set to signal AUL activation and/or AUL release, the remaining fields may all be set to some default value (e.g., a value of “0”). An RNTI used for this DCI may be SPS C-RNTI, or may be a newly defined RNTI, for instance an AUL-C-RNTI, which may use reserved RNTI values. In another embodiment, since a DCI format 0 may also be used for G-DCI to carry HARQ ACK/Non-Acknowledgement (NACK), one or more bits of fields which are not used for AUL activation and/or AUL release might be used to distinguish between these two cases.
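As an illustrative sketch of how the hypothetical new DCI format of Table 5 might be serialized, the fragment below packs and parses the listed fields. The field ordering, the 4-bit RBA width, and the Python names are assumptions made for this sketch, not a standardized layout:

```python
# Field widths of the hypothetical new AUL DCI of Table 5 (4-bit RBA case);
# names and ordering here are illustrative assumptions.
AUL_DCI_FIELDS = [
    ("carrier_indicator", 3),
    ("format_flag", 1),
    ("rba", 4),
    ("mcs", 5),
    ("tpc", 2),
    ("cs_occ", 3),
    ("aul_act_rel", 1),
]

def pack_aul_dci(values):
    """Concatenate the fields MSB-first into a single bit string."""
    return "".join(format(values[name], f"0{width}b")
                   for name, width in AUL_DCI_FIELDS)

def parse_aul_dci(bits):
    """Split a received bit string back into named integer fields."""
    out, pos = {}, 0
    for name, width in AUL_DCI_FIELDS:
        out[name] = int(bits[pos:pos + width], 2)
        pos += width
    return out
```

With these widths the DCI payload is 19 bits; packing a field dictionary and parsing the result round-trips to the same values.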
Embodiments falling within a third type may employ reuse of DCI format 0A, DCI format 4A, DCI format 0B, and/or DCI format 4B for the scheduling of PUSCH. DCI format 0A, DCI format 0B, DCI format 4A, and/or DCI format 4B may be used for scheduling PUSCH (e.g., in a MulteFire™ system) and/or MulteFire™ enhanced Physical Uplink Control Channel (ePUCCH). DCI format 0A and/or DCI format 0B may be used for scheduling of PUSCH (or MulteFire™ ePUCCH) for multiple subframes, while DCI format 4A and/or DCI format 4B may be used with multi-antenna port transmission mode. These DCIs may be structured as described in Table 4. In the context of AUL activation and AUL release, DCI format 0A, DCI format 4A, DCI format 0B, and/or DCI format 4B may be reused, for example depending on a multi-antenna port transmission mode and/or a number of scheduled subframes. This may be done by scrambling the DCI via SPS C-RNTI or via a newly defined RNTI, such as via an AUL-C-RNTI, which may use reserved RNTI values. When one of these DCIs is used for AUL activation and/or AUL release, one or more of its fields may be set in various ways. In some embodiments, for SPS activation, all fields may be set to a default value (e.g., to values of all “0”, or alternatively another set of values). In some embodiments, for SPS release, one or more fields may be set differently from the above setting. For instance, the coding used in Table 3 may be adopted by setting MCS to a value of “11111”, or alternatively one or more other fields (e.g., a CS/OCC field) may also be set to values of all “1”. Accordingly, in various exemplary embodiments, merely a subset of the fields in the DCI format 0A, DCI format 0B, DCI format 4A, and/or DCI format 4B may be used for AUL activation and/or AUL release validation. These fields may be set to some default values, while other fields may be used for indication of AUL transmission.
For example, the resource block assignment field may be used for resource indication if AUL does not use the whole system bandwidth, and the CS/OCC field may be used to indicate a cyclic shift for Demodulation Reference Signal (DM-RS) and an OCC index, unless the UE determines these autonomously or CS/OCC is predefined. In some embodiments, where DCI format 0A may be adopted for G-DCI to carry HARQ ACK/NACK, one or more bits from fields which might not be directly utilized for AUL activation and AUL release might be used to distinguish between these two cases. In some embodiments, some of the fields may be used and dedicated for AUL activation and AUL release, while other fields could be dedicated to carry HARQ ACK/NACK. Embodiments falling within a fourth type may use a number N of bits in a cPDCCH reserved for these functionalities. In some embodiments, N bits may be used in a cPDCCH, which may serve as a flag and may indicate whether AUL needs to be activated or released. For example, a flag may be assigned a first value (e.g., a value of “0”) to activate AUL, and the flag may be assigned a second value (e.g., a value of “1”) to release AUL (although in some alternate embodiments, the first value may be “1” and the second value may be “0”). For some embodiments, N may be 1, and this may be used to activate AUL or release AUL for one or more UEs configured with AUL (e.g., all UEs). Alternatively, for some embodiments, N may be more than 1, to activate or release AUL for a subset of UEs configured with AUL. For example, for N equal to 2, the bits of a two-bit bitmap may be used to activate/release AUL for two respectively corresponding groups of UEs configured with AUL. The N bits may reuse reserved bits in an existing cPDCCH (and thus the cPDCCH may be the same size as a current cPDCCH). Alternatively, a cPDCCH size may be extended to a larger value with an additional N bits used for the AUL activation and/or AUL release.
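A minimal sketch of the N-bit cPDCCH flag scheme above, using the convention that “0” activates and “1” releases (the function name, the bitmap ordering, and the default group index are assumptions for this sketch, not part of the patent text):

```python
def cpdcch_aul_action(reserved_bits, group_index=0):
    """Interpret N reserved cPDCCH bits as per-group AUL commands.
    For N = 1 the single bit applies to all AUL-configured UEs; for
    N > 1 each bit applies to one group of AUL-configured UEs. UEs
    not configured with AUL would simply ignore these bits."""
    return "activate" if reserved_bits[group_index] == "0" else "release"
```

For instance, with N equal to 2 and the bitmap “01”, UE group 0 would activate AUL while UE group 1 would release it.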
In some embodiments, a new cPDCCH may be defined to include the information needed for AUL activation and/or AUL release. For some embodiments, the cPDCCH may be scrambled by a current C-RNTI, or a newly defined RNTI can be used. In some embodiments, UEs that are not configured with AUL might not interpret these N bits, and/or might not decode the cPDCCH differently from an existing cPDCCH. Embodiments falling within a fifth type may employ UE-group-specific or cell-specific DCI. For embodiments falling within the first type, the second type, and/or the third type above, a search space (SS) may be either UE-specific (which may be similar to SPS activation and/or SPS release), or may be UE-group-specific in order to reduce overhead and a likelihood of collision between UEs configured with AUL. Both for embodiments falling within the fourth type with multiple groups of UEs, and for embodiments falling within the fifth type, a UE group may be semi-statically configured (e.g., via RRC), or may be dynamically configured (e.g., depending on one or more current buffer statuses of each UE). In this context, there may be several ways to define a group or to select UEs. In some embodiments, a UE may be in AUL based on a buffer status report (BSR). For some embodiments, a UE may be in AUL based on location. In some embodiments, a UE may be in AUL based on the Channel State Information (CSI) at the eNB. For some embodiments, a UE may be in AUL based on a random selection among a pool of UEs. In some embodiments, a UE grouping rule may reuse the Transmit Power Control (TPC) configuration rule of DCI format 3 and/or DCI format 3A, where each bit in a bitmap may correspond to activation and/or release of one specific user (e.g., UE). For embodiments falling within the first type through the fifth type above, the DCI may include additional information related to the resources where AUL may be performed.
For example, a number M of bits may be carried in DCI to indicate an offset after which AUL may be activated. For embodiments falling within the first type through the fifth type above, DCI may include additional information to configure one or more BSR thresholds. In such embodiments, UEs with a BSR larger than a BSR threshold may activate AUL transmission, while UEs with a BSR smaller than the BSR threshold may wait for an eNB's scheduling. For various embodiments, an eNB may either explicitly configure the BSR threshold via DCI, or may transmit the BSR threshold in RRC signaling and configure an associated bit in DCI. For example, four BSR thresholds may be configured in RRC, where a value of “00” in DCI may correspond with a first BSR threshold, a value of “01” in DCI may correspond with a second BSR threshold, and so on. In various embodiments, different traffic types may have different BSRs (e.g., different BSR thresholds), and an eNB may configure separate BSRs (e.g., different BSR thresholds) for different traffic types. Embodiments falling within a sixth type may employ cell-specific SPS RRC. Instead of using one or more Media Access Control (MAC) Control Elements (CEs) to acknowledge the reception of the DCI for SPS activation, UE-specific MAC feedback might not be reported. However, one or more UE-specific parameters may be indicated in a UE-specific activation and/or release DCI, and ACK/NACK for the DCI related to AUL release may be carried on the Uplink Control Information (UCI). Various embodiments may also fall within one or more of the types discussed herein. FIG. 2 illustrates an eNB and a UE, in accordance with some embodiments of the disclosure. FIG. 2 includes block diagrams of an eNB 210 and a UE 230 which are operable to co-exist with each other and other elements of an LTE network. High-level, simplified architectures of eNB 210 and UE 230 are described so as not to obscure the embodiments.
It should be noted that in some embodiments, eNB 210 may be a stationary non-mobile device. eNB 210 is coupled to one or more antennas 205, and UE 230 is similarly coupled to one or more antennas 225. However, in some embodiments, eNB 210 may incorporate or comprise antennas 205, and UE 230 in various embodiments may incorporate or comprise antennas 225. In some embodiments, antennas 205 and/or antennas 225 may comprise one or more directional or omni-directional antennas, including monopole antennas, dipole antennas, loop antennas, patch antennas, microstrip antennas, coplanar wave antennas, or other types of antennas suitable for transmission of RF signals. In some MIMO (multiple-input and multiple-output) embodiments, antennas 205 are separated to take advantage of spatial diversity. eNB 210 and UE 230 are operable to communicate with each other on a network, such as a wireless network. eNB 210 and UE 230 may be in communication with each other over a wireless communication channel 250, which has both a downlink path from eNB 210 to UE 230 and an uplink path from UE 230 to eNB 210. As illustrated in FIG. 2, in some embodiments, eNB 210 may include a physical layer circuitry 212, a MAC (media access control) circuitry 214, a processor 216, a memory 218, and a hardware processing circuitry 220. A person skilled in the art will appreciate that other components not shown may be used in addition to the components shown to form a complete eNB. In some embodiments, physical layer circuitry 212 includes a transceiver 213 for providing signals to and from UE 230. Transceiver 213 provides signals to and from UEs or other devices using one or more antennas 205. In some embodiments, MAC circuitry 214 controls access to the wireless medium.
Memory 218 may be, or may include, a storage media/medium such as a magnetic storage media (e.g., magnetic tapes or magnetic disks), an optical storage media (e.g., optical discs), an electronic storage media (e.g., conventional hard disk drives, solid-state disk drives, or flash-memory-based storage media), or any tangible storage media or non-transitory storage media. Hardware processing circuitry 220 may comprise logic devices or circuitry to perform various operations. In some embodiments, processor 216 and memory 218 are arranged to perform the operations of hardware processing circuitry 220, such as operations described herein with reference to logic devices and circuitry within eNB 210 and/or hardware processing circuitry 220. Accordingly, in some embodiments, eNB 210 may be a device comprising an application processor, a memory, one or more antenna ports, and an interface for allowing the application processor to communicate with another device. As is also illustrated in FIG. 2, in some embodiments, UE 230 may include a physical layer circuitry 232, a MAC circuitry 234, a processor 236, a memory 238, a hardware processing circuitry 240, a wireless interface 242, and a display 244. A person skilled in the art would appreciate that other components not shown may be used in addition to the components shown to form a complete UE. In some embodiments, physical layer circuitry 232 includes a transceiver 233 for providing signals to and from eNB 210 (as well as other eNBs). Transceiver 233 provides signals to and from eNBs or other devices using one or more antennas 225. In some embodiments, MAC circuitry 234 controls access to the wireless medium. 
Memory 238 may be, or may include, a storage media/medium such as a magnetic storage media (e.g., magnetic tapes or magnetic disks), an optical storage media (e.g., optical discs), an electronic storage media (e.g., conventional hard disk drives, solid-state disk drives, or flash-memory-based storage media), or any tangible storage media or non-transitory storage media. Wireless interface 242 may be arranged to allow the processor to communicate with another device. Display 244 may provide a visual and/or tactile display for a user to interact with UE 230, such as a touch-screen display. Hardware processing circuitry 240 may comprise logic devices or circuitry to perform various operations. In some embodiments, processor 236 and memory 238 may be arranged to perform the operations of hardware processing circuitry 240, such as operations described herein with reference to logic devices and circuitry within UE 230 and/or hardware processing circuitry 240. Accordingly, in some embodiments, UE 230 may be a device comprising an application processor, a memory, one or more antennas, a wireless interface for allowing the application processor to communicate with another device, and a touch-screen display. Elements of FIG. 2, and elements of other figures having the same names or reference numbers, can operate or function in the manner described herein with respect to any such figures (although the operation and function of such elements is not limited to such descriptions). For example, FIGS. 3 and 6-7 also depict embodiments of eNBs, hardware processing circuitry of eNBs, UEs, and/or hardware processing circuitry of UEs, and the embodiments described with respect to FIG. 2 and FIGS. 3 and 6-7 can operate or function in the manner described herein with respect to any of the figures. 
In addition, although eNB 210 and UE 230 are each described as having several separate functional elements, one or more of the functional elements may be combined and may be implemented by combinations of software-configured elements and/or other hardware elements. In some embodiments of this disclosure, the functional elements can refer to one or more processes operating on one or more processing elements. Examples of software- and/or hardware-configured elements include one or more microprocessors, Digital Signal Processors (DSPs), Field-Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Radio-Frequency Integrated Circuits (RFICs), and so on. FIG. 3 illustrates hardware processing circuitries for a UE for SPS activation and/or SPS release, in accordance with some embodiments of the disclosure. With reference to FIG. 2, a UE may include various hardware processing circuitries discussed herein (such as hardware processing circuitry 300 of FIG. 3), which may in turn comprise logic devices and/or circuitry operable to perform various operations. For example, in FIG. 2, UE 230 (or various elements or components therein, such as hardware processing circuitry 240, or combinations of elements or components therein) may include part of, or all of, these hardware processing circuitries. In some embodiments, one or more devices or circuitries within these hardware processing circuitries may be implemented by combinations of software-configured elements and/or other hardware elements. For example, processor 236 (and/or one or more other processors which UE 230 may comprise), memory 238, and/or other elements or components of UE 230 (which may include hardware processing circuitry 240) may be arranged to perform the operations of these hardware processing circuitries, such as operations described herein with reference to devices and circuitry within these hardware processing circuitries. 
In some embodiments, processor 236 (and/or one or more other processors which UE 230 may comprise) may be a baseband processor. Returning to FIG. 3, an apparatus of UE 230 (or another UE or mobile handset), which may be operable to communicate with one or more eNBs on a wireless network, may comprise hardware processing circuitry 300. In some embodiments, hardware processing circuitry 300 may comprise one or more antenna ports 305 operable to provide various transmissions over a wireless communication channel (such as wireless communication channel 250). Antenna ports 305 may be coupled to one or more antennas 307 (which may be antennas 225). In some embodiments, hardware processing circuitry 300 may incorporate antennas 307, while in other embodiments, hardware processing circuitry 300 may merely be coupled to antennas 307. Antenna ports 305 and antennas 307 may be operable to provide signals from a UE to a wireless communications channel and/or an eNB, and may be operable to provide signals from an eNB and/or a wireless communications channel to a UE. For example, antenna ports 305 and antennas 307 may be operable to provide transmissions from UE 230 to wireless communication channel 250 (and from there to eNB 210, or to another eNB). Similarly, antennas 307 and antenna ports 305 may be operable to provide transmissions from a wireless communication channel 250 (and beyond that, from eNB 210, or another eNB) to UE 230. Hardware processing circuitry 300 may comprise various circuitries operable in accordance with the various embodiments discussed herein. With reference to FIG. 3, hardware processing circuitry 300 may comprise a first circuitry 310 and/or a second circuitry 320. In a variety of embodiments, first circuitry 310 may be operable to process a first DCI format 0A transmission indicating an SPS activation. First circuitry 310 may also be operable to process a second DCI format 0A transmission indicating an SPS release. 
Second circuitry 320 may be operable to generate one or more UL transmissions for an unlicensed spectrum of the wireless network after the SPS activation and before the SPS release in accordance with a configured schedule. First circuitry 310 may be operable to provide an indicator of SPS activation and/or SPS release to second circuitry 320 via an interface 312. Hardware processing circuitry 300 may also comprise an interface for sending UL transmissions to a transmission circuitry and for receiving DCI format 0A transmissions from a receiving circuitry. In a variety of embodiments, at least one of the first DCI format 0A transmission and the second DCI format 0A transmission may carry a UE-specific parameter. For some embodiments, at least one of the first DCI format 0A transmission and the second DCI format 0A transmission may be scrambled with an AUL-C-RNTI. In some embodiments, the one or more UL transmissions may comprise at least one of: one or more PUSCH transmissions, or one or more Physical Uplink Control Channel (PUCCH) transmissions (e.g., ePUCCH transmissions). For some embodiments, first circuitry 310 may be operable to process a transmission carrying the configured schedule. In some embodiments, the transmission carrying the configured schedule may be a PDCCH transmission. For some embodiments, SPS activation may be indicated by one or more fields of the first DCI format 0A transmission having a first set of values, and SPS release may be indicated by one or more fields of the second DCI format 0A transmission having a second set of values. In some embodiments, SPS activation may be indicated by all bits of the one or more fields of the first DCI format 0A transmission being set to a first value, and SPS release may be indicated by all bits of the one or more fields of the second DCI format 0A transmission being set to a second value. 
For some embodiments, at least one of the first DCI format 0A transmission and the second DCI format 0A transmission may carry at least one of: a one-bit FF field; a two-bit TPC command field; and a five-bit MCS field. In some embodiments, the first DCI format 0A transmission may carry a two-bit TPC command field having a value of “00” and/or a five-bit MCS field with a most significant bit having a value of “0”. For some embodiments, the second DCI format 0A transmission may carry a two-bit TPC command field having a value of “00” and/or a five-bit MCS field having a value of “11111”. In a variety of embodiments, first circuitry 310 may be operable to process a first DL transmission scrambled with an AUL-C-RNTI, the first DL transmission indicating an SPS activation. First circuitry 310 may also be operable to process a second DL transmission scrambled with the AUL-C-RNTI, the second DL transmission indicating an SPS release. Second circuitry 320 may be operable to generate one or more UL transmissions for an unlicensed spectrum of the wireless network after the SPS activation and before the SPS release in accordance with a configured schedule. First circuitry 310 may be operable to provide an indicator of SPS activation and/or SPS release to second circuitry 320 via an interface 312. Hardware processing circuitry 300 may also comprise an interface for sending UL transmissions to a transmission circuitry and for receiving DL transmissions from a receiving circuitry. In a variety of embodiments, at least one of the first DL transmission and the second DL transmission may carry a UE-specific parameter. For some embodiments, the one or more UL transmissions may comprise at least one of: one or more PUSCH transmissions, or one or more PUCCH transmissions (e.g., ePUCCH transmissions). For some embodiments, first circuitry 310 may be operable to process a DL transmission carrying the configured schedule. 
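The activation/release validation rules above can be sketched as a small decision function. The field values follow the description above; the function name and the bit-string representation of the fields are illustrative assumptions, not taken from any specification.

```python
from typing import Optional

def classify_dci_0a(tpc_bits: str, mcs_bits: str) -> Optional[str]:
    """Illustrative classifier for a DCI format 0A transmission, following
    the validation values described above: a "00" TPC command with the MCS
    most-significant bit cleared indicates SPS activation, while a "00" TPC
    command with all MCS bits set indicates SPS release."""
    if len(tpc_bits) != 2 or len(mcs_bits) != 5:
        raise ValueError("expected a two-bit TPC field and a five-bit MCS field")
    if tpc_bits != "00":
        return None           # neither activation nor release is indicated
    if mcs_bits == "11111":
        return "release"      # all bits of the MCS field set to one
    if mcs_bits[0] == "0":
        return "activation"   # MCS most-significant bit set to zero
    return None
```

Checking both field conditions together mirrors the description above, in which the TPC and MCS values jointly distinguish an activation command from a release command.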
In some embodiments, the DL transmission carrying the configured schedule may be a PDCCH transmission. For some embodiments, the first DL transmission is a first DCI format 0A transmission, and the second DL transmission is a second DCI format 0A transmission. In some embodiments, SPS activation may be indicated by one or more fields of the first DCI format 0A transmission having a first set of values, and SPS release may be indicated by one or more fields of the second DCI format 0A transmission having a second set of values. For some embodiments, SPS activation may be indicated by all bits of one or more fields of the first DCI format 0A transmission being set to a first value, and SPS release may be indicated by all bits of one or more fields of the second DCI format 0A transmission being set to a second value. In some embodiments, first circuitry 310 and/or second circuitry 320 may be implemented as separate circuitries. In other embodiments, first circuitry 310 and/or second circuitry 320 may be combined and implemented together in a circuitry without altering the essence of the embodiments. FIGS. 4-5 illustrate methods for a UE for SPS activation and/or SPS release, in accordance with some embodiments of the disclosure. With reference to FIG. 2, methods that may relate to UE 230 and hardware processing circuitry 240 are discussed herein. Although the actions in method 400 of FIG. 4 and method 500 of FIG. 5 are shown in a particular order, the order of the actions can be modified. Thus, the illustrated embodiments can be performed in a different order, and some actions may be performed in parallel. Some of the actions and/or operations listed in FIGS. 4-5 are optional in accordance with certain embodiments. The numbering of the actions presented is for the sake of clarity and is not intended to prescribe an order of operations in which the various actions must occur. Additionally, operations from the various flows may be utilized in a variety of combinations. 
Moreover, in some embodiments, machine readable storage media may have executable instructions that, when executed, cause UE 230 and/or hardware processing circuitry 240 to perform an operation comprising the methods of FIGS. 4-5. Such machine readable storage media may include any of a variety of storage media, like magnetic storage media (e.g., magnetic tapes or magnetic disks), optical storage media (e.g., optical discs), electronic storage media (e.g., conventional hard disk drives, solid-state disk drives, or flash-memory-based storage media), or any other tangible storage media or non-transitory storage media. In some embodiments, an apparatus may comprise means for performing various actions and/or operations of the methods of FIGS. 4-5. Returning to FIG. 4, various methods may be in accordance with the various embodiments discussed herein. A method 400 may comprise a processing 410, a processing 415, and a generating 420. Method 400 may also comprise, for example, a processing 430. In processing 410, a first DCI format 0A transmission indicating an SPS activation may be processed. In processing 415, a second DCI format 0A transmission indicating an SPS release may be processed. In generating 420, one or more UL transmissions for an unlicensed spectrum of the wireless network may be generated after the SPS activation and before the SPS release in accordance with a configured schedule. In some embodiments, at least one of the first DCI format 0A transmission and the second DCI format 0A transmission may carry a UE-specific parameter. For some embodiments, at least one of the first DCI format 0A transmission and the second DCI format 0A transmission may be scrambled with an AUL-C-RNTI. In some embodiments, the one or more UL transmissions may comprise at least one of: one or more PUSCH transmissions, or one or more PUCCH transmissions (e.g., ePUCCH transmissions). In processing 430, a transmission carrying the configured schedule may be processed. 
In some embodiments, the transmission carrying the configured schedule may be a PDCCH transmission. For some embodiments, SPS activation may be indicated by one or more fields of the first DCI format 0A transmission having a first set of values, and SPS release may be indicated by one or more fields of the second DCI format 0A transmission having a second set of values. In some embodiments, SPS activation may be indicated by all bits of the one or more fields of the first DCI format 0A transmission being set to a first value, and SPS release may be indicated by all bits of the one or more fields of the second DCI format 0A transmission being set to a second value. For some embodiments, at least one of the first DCI format 0A transmission and the second DCI format 0A transmission may carry at least one of: a one-bit FF field; a two-bit TPC command field; and a five-bit MCS field. In some embodiments, the first DCI format 0A transmission may carry a two-bit TPC command field having a value of “00” and/or a five-bit MCS field with a most significant bit having a value of “0”. For some embodiments, the second DCI format 0A transmission may carry a two-bit TPC command field having a value of “00” and/or a five-bit MCS field having a value of “11111”. Returning to FIG. 5, various methods may be in accordance with the various embodiments discussed herein. A method 500 may comprise a processing 510, a processing 515, and a generating 520. Method 500 may also comprise, for example, a processing 530. In processing 510, a first DL transmission scrambled with an AUL-C-RNTI may be processed, the first DL transmission indicating an SPS activation. In processing 515, a second DL transmission scrambled with the AUL-C-RNTI may be processed, the second DL transmission indicating an SPS release. 
In generating 520, one or more UL transmissions for an unlicensed spectrum of the wireless network may be generated after the SPS activation and before the SPS release in accordance with a configured schedule. In some embodiments, at least one of the first DL transmission and the second DL transmission may carry a UE-specific parameter. For some embodiments, the one or more UL transmissions may comprise at least one of: one or more PUSCH transmissions, or one or more PUCCH transmissions (e.g., ePUCCH transmissions). For some embodiments, in processing 530, a DL transmission carrying the configured schedule may be processed. In some embodiments, the DL transmission carrying the configured schedule may be a PDCCH transmission. For some embodiments, the first DL transmission is a first DCI format 0A transmission, and the second DL transmission is a second DCI format 0A transmission. In some embodiments, SPS activation may be indicated by one or more fields of the first DCI format 0A transmission having a first set of values, and SPS release may be indicated by one or more fields of the second DCI format 0A transmission having a second set of values. For some embodiments, SPS activation may be indicated by all bits of one or more fields of the first DCI format 0A transmission being set to a first value, and SPS release may be indicated by all bits of one or more fields of the second DCI format 0A transmission being set to a second value. FIG. 6 illustrates example components of a device, in accordance with some embodiments of the disclosure. In some embodiments, the device 600 may include application circuitry 602, baseband circuitry 604, Radio Frequency (RF) circuitry 606, front-end module (FEM) circuitry 608, one or more antennas 610, and power management circuitry (PMC) 612 coupled together at least as shown. The components of the illustrated device 600 may be included in a UE or a RAN node. 
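As a rough illustration of the RNTI scrambling mentioned above: in LTE, the 16-bit CRC attached to a DCI payload is XORed with the 16 bits of the RNTI, and a UE recognizes transmissions addressed to it by descrambling the received CRC with its own identity. The helper names and the sample RNTI value below are hypothetical.

```python
def scramble_crc(crc_bits, rnti):
    """XOR a 16-bit CRC (list of bits, MSB first) with a 16-bit RNTI."""
    rnti_bits = [(rnti >> (15 - i)) & 1 for i in range(16)]
    return [c ^ r for c, r in zip(crc_bits, rnti_bits)]

def addressed_to_ue(received_crc_bits, computed_crc_bits, ue_rnti):
    """A UE descrambles the received CRC with its own RNTI (e.g., an
    AUL-C-RNTI) and compares it to the CRC it computes over the payload;
    a match means the transmission is addressed to this UE."""
    return scramble_crc(received_crc_bits, ue_rnti) == computed_crc_bits
```

Because XOR is its own inverse, scrambling and descrambling are the same operation; a UE configured with a different RNTI simply fails the comparison and ignores the transmission.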
In some embodiments, the device 600 may include fewer elements (e.g., a RAN node may not utilize application circuitry 602, and instead include a processor/controller to process IP data received from an EPC). In some embodiments, the device 600 may include additional elements such as, for example, memory/storage, display, camera, sensor, or input/output (I/O) interface. In other embodiments, the components described below may be included in more than one device (e.g., said circuitries may be separately included in more than one device for Cloud-RAN (C-RAN) implementations). The application circuitry 602 may include one or more application processors. For example, the application circuitry 602 may include circuitry such as, but not limited to, one or more single-core or multi-core processors. The processor(s) may include any combination of general-purpose processors and dedicated processors (e.g., graphics processors, application processors, and so on). The processors may be coupled with or may include memory/storage and may be configured to execute instructions stored in the memory/storage to enable various applications or operating systems to run on the device 600. In some embodiments, processors of application circuitry 602 may process IP data packets received from an EPC. The baseband circuitry 604 may include circuitry such as, but not limited to, one or more single-core or multi-core processors. The baseband circuitry 604 may include one or more baseband processors or control logic to process baseband signals received from a receive signal path of the RF circuitry 606 and to generate baseband signals for a transmit signal path of the RF circuitry 606. Baseband processing circuitry 604 may interface with the application circuitry 602 for generation and processing of the baseband signals and for controlling operations of the RF circuitry 606. 
For example, in some embodiments, the baseband circuitry 604 may include a third generation (3G) baseband processor 604A, a fourth generation (4G) baseband processor 604B, a fifth generation (5G) baseband processor 604C, or other baseband processor(s) 604D for other existing generations, generations in development or to be developed in the future (e.g., second generation (2G), sixth generation (6G), and so on). The baseband circuitry 604 (e.g., one or more of baseband processors 604A-D) may handle various radio control functions that enable communication with one or more radio networks via the RF circuitry 606. In other embodiments, some or all of the functionality of baseband processors 604A-D may be included in modules stored in the memory 604G and executed via a Central Processing Unit (CPU) 604E. The radio control functions may include, but are not limited to, signal modulation/demodulation, encoding/decoding, radio frequency shifting, and so on. In some embodiments, modulation/demodulation circuitry of the baseband circuitry 604 may include Fast-Fourier Transform (FFT), precoding, or constellation mapping/demapping functionality. In some embodiments, encoding/decoding circuitry of the baseband circuitry 604 may include convolution, tail-biting convolution, turbo, Viterbi, or Low Density Parity Check (LDPC) encoder/decoder functionality. Embodiments of modulation/demodulation and encoder/decoder functionality are not limited to these examples and may include other suitable functionality in other embodiments. In some embodiments, the baseband circuitry 604 may include one or more audio digital signal processor(s) (DSP) 604F. The audio DSP(s) 604F may include elements for compression/decompression and echo cancellation and may include other suitable processing elements in other embodiments. Components of the baseband circuitry may be suitably combined in a single chip, a single chipset, or disposed on a same circuit board in some embodiments. 
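As a toy example of the constellation-mapping functionality mentioned above, a Gray-coded QPSK mapper takes two bits per symbol. The particular bit-to-symbol table here is one illustrative choice; the normative mapping tables are defined by the relevant 3GPP physical-layer specifications.

```python
import math

# One illustrative Gray-coded QPSK mapping: adjacent constellation points
# differ in a single bit, which limits bit errors from small phase noise.
QPSK = {
    (0, 0): complex( 1,  1),
    (0, 1): complex( 1, -1),
    (1, 0): complex(-1,  1),
    (1, 1): complex(-1, -1),
}

def qpsk_map(bits):
    """Map an even-length bit sequence onto unit-energy QPSK symbols."""
    scale = 1 / math.sqrt(2)  # normalize each symbol to unit energy
    return [QPSK[(bits[i], bits[i + 1])] * scale
            for i in range(0, len(bits), 2)]
```

The demapping direction of the circuitry would invert this table, typically by choosing the constellation point nearest to each received sample.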
In some embodiments, some or all of the constituent components of the baseband circuitry 604 and the application circuitry 602 may be implemented together such as, for example, on a system on a chip (SOC). In some embodiments, the baseband circuitry 604 may provide for communication compatible with one or more radio technologies. For example, in some embodiments, the baseband circuitry 604 may support communication with an evolved universal terrestrial radio access network (EUTRAN) or other wireless metropolitan area networks (WMAN), a wireless local area network (WLAN), or a wireless personal area network (WPAN). Embodiments in which the baseband circuitry 604 is configured to support radio communications of more than one wireless protocol may be referred to as multi-mode baseband circuitry. RF circuitry 606 may enable communication with wireless networks using modulated electromagnetic radiation through a non-solid medium. In various embodiments, the RF circuitry 606 may include switches, filters, amplifiers, and so on to facilitate the communication with the wireless network. RF circuitry 606 may include a receive signal path which may include circuitry to down-convert RF signals received from the FEM circuitry 608 and provide baseband signals to the baseband circuitry 604. RF circuitry 606 may also include a transmit signal path which may include circuitry to up-convert baseband signals provided by the baseband circuitry 604 and provide RF output signals to the FEM circuitry 608 for transmission. In some embodiments, the receive signal path of the RF circuitry 606 may include mixer circuitry 606A, amplifier circuitry 606B and filter circuitry 606C. In some embodiments, the transmit signal path of the RF circuitry 606 may include filter circuitry 606C and mixer circuitry 606A. RF circuitry 606 may also include synthesizer circuitry 606D for synthesizing a frequency for use by the mixer circuitry 606A of the receive signal path and the transmit signal path. 
In some embodiments, the mixer circuitry 606A of the receive signal path may be configured to down-convert RF signals received from the FEM circuitry 608 based on the synthesized frequency provided by synthesizer circuitry 606D. The amplifier circuitry 606B may be configured to amplify the down-converted signals and the filter circuitry 606C may be a low-pass filter (LPF) or band-pass filter (BPF) configured to remove unwanted signals from the down-converted signals to generate output baseband signals. Output baseband signals may be provided to the baseband circuitry 604 for further processing. In some embodiments, the output baseband signals may be zero-frequency baseband signals, although this is not a requirement. In some embodiments, mixer circuitry 606A of the receive signal path may comprise passive mixers, although the scope of the embodiments is not limited in this respect. In some embodiments, the mixer circuitry 606A of the transmit signal path may be configured to up-convert input baseband signals based on the synthesized frequency provided by the synthesizer circuitry 606D to generate RF output signals for the FEM circuitry 608. The baseband signals may be provided by the baseband circuitry 604 and may be filtered by filter circuitry 606C. In some embodiments, the mixer circuitry 606A of the receive signal path and the mixer circuitry 606A of the transmit signal path may include two or more mixers and may be arranged for quadrature downconversion and upconversion, respectively. In some embodiments, the mixer circuitry 606A of the receive signal path and the mixer circuitry 606A of the transmit signal path may include two or more mixers and may be arranged for image rejection (e.g., Hartley image rejection). In some embodiments, the mixer circuitry 606A of the receive signal path and the mixer circuitry 606A of the transmit signal path may be arranged for direct downconversion and direct upconversion, respectively. 
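The direct (zero-IF) quadrature downconversion described above can be sketched numerically: each RF sample is mixed with the cosine and negative-sine phases of the synthesized LO frequency to produce the I and Q baseband components. The function name and sampling parameters are illustrative, and a low-pass filter would follow in a real receive path.

```python
import math

def quadrature_downconvert(rf, f_lo, fs):
    """Zero-IF quadrature downconversion sketch: mix sampled RF with two
    LO phases 90 degrees apart to obtain complex (I + jQ) baseband samples.
    rf is a list of real samples, f_lo the LO frequency in Hz, fs the
    sample rate in Hz."""
    out = []
    for n, x in enumerate(rf):
        phase = 2 * math.pi * f_lo * n / fs
        i = x * math.cos(phase)   # in-phase mixer output
        q = x * -math.sin(phase)  # quadrature mixer output
        out.append(complex(i, q))
    return out
```

Mixing a pure tone at exactly the LO frequency lands the desired signal at DC, which is the zero-frequency baseband case the text mentions; the unwanted double-frequency product is what the subsequent low-pass filter removes.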
In some embodiments, the mixer circuitry 606A of the receive signal path and the mixer circuitry 606A of the transmit signal path may be configured for super-heterodyne operation. In some embodiments, the output baseband signals and the input baseband signals may be analog baseband signals, although the scope of the embodiments is not limited in this respect. In some alternate embodiments, the output baseband signals and the input baseband signals may be digital baseband signals. In these alternate embodiments, the RF circuitry 606 may include analog-to-digital converter (ADC) and digital-to-analog converter (DAC) circuitry and the baseband circuitry 604 may include a digital baseband interface to communicate with the RF circuitry 606. In some dual-mode embodiments, a separate radio IC circuitry may be provided for processing signals for each spectrum, although the scope of the embodiments is not limited in this respect. In some embodiments, the synthesizer circuitry 606D may be a fractional-N synthesizer or a fractional N/N+1 synthesizer, although the scope of the embodiments is not limited in this respect as other types of frequency synthesizers may be suitable. For example, synthesizer circuitry 606D may be a delta-sigma synthesizer, a frequency multiplier, or a synthesizer comprising a phase-locked loop with a frequency divider. The synthesizer circuitry 606D may be configured to synthesize an output frequency for use by the mixer circuitry 606A of the RF circuitry 606 based on a frequency input and a divider control input. In some embodiments, the synthesizer circuitry 606D may be a fractional N/N+1 synthesizer. In some embodiments, frequency input may be provided by a voltage controlled oscillator (VCO), although that is not a requirement. Divider control input may be provided by either the baseband circuitry 604 or the applications processor 602 depending on the desired output frequency. 
In some embodiments, a divider control input (e.g., N) may be determined from a look-up table based on a channel indicated by the applications processor 602. Synthesizer circuitry 606D of the RF circuitry 606 may include a divider, a delay-locked loop (DLL), a multiplexer and a phase accumulator. In some embodiments, the divider may be a dual modulus divider (DMD) and the phase accumulator may be a digital phase accumulator (DPA). In some embodiments, the DMD may be configured to divide the input signal by either N or N+1 (e.g., based on a carry out) to provide a fractional division ratio. In some example embodiments, the DLL may include a set of cascaded, tunable, delay elements, a phase detector, a charge pump and a D-type flip-flop. In these embodiments, the delay elements may be configured to break a VCO period up into Nd equal packets of phase, where Nd is the number of delay elements in the delay line. In this way, the DLL provides negative feedback to help ensure that the total delay through the delay line is one VCO cycle. In some embodiments, synthesizer circuitry 606D may be configured to generate a carrier frequency as the output frequency, while in other embodiments, the output frequency may be a multiple of the carrier frequency (e.g., twice the carrier frequency, four times the carrier frequency) and used in conjunction with quadrature generator and divider circuitry to generate multiple signals at the carrier frequency with multiple different phases with respect to each other. In some embodiments, the output frequency may be an LO frequency (fLO). In some embodiments, the RF circuitry 606 may include an IQ/polar converter. FEM circuitry 608 may include a receive signal path which may include circuitry configured to operate on RF signals received from one or more antennas 610, amplify the received signals and provide the amplified versions of the received signals to the RF circuitry 606 for further processing. 
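The dual-modulus divider driven by a digital phase accumulator can be sketched as follows: on each reference cycle the accumulator adds a fractional tuning word, and its carry-out selects division by N+1 instead of N, so the long-run average division ratio approaches N + word/modulus. The function name and parameter values below are illustrative.

```python
def dual_modulus_divide(n_int, frac_word, accumulator_bits, cycles):
    """Simulate a DMD controlled by a digital phase accumulator (DPA).
    Each cycle the accumulator adds frac_word modulo 2**accumulator_bits;
    a carry-out selects the N+1 modulus for that cycle. Returns the
    average division ratio over the requested number of cycles."""
    modulus = 1 << accumulator_bits
    acc, total = 0, 0
    for _ in range(cycles):
        acc += frac_word
        carry = acc >= modulus       # carry-out selects divide-by-(N+1)
        acc %= modulus
        total += n_int + 1 if carry else n_int
    return total / cycles
```

For example, with a 2-bit accumulator and a tuning word of 1, the divider uses N+1 once every four cycles, giving an average ratio of N + 1/4; this is the fractional division ratio the text attributes to the carry-out mechanism.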
FEM circuitry 608 may also include a transmit signal path which may include circuitry configured to amplify signals for transmission provided by the RF circuitry 606 for transmission by one or more of the one or more antennas 610. In various embodiments, the amplification through the transmit or receive signal paths may be done solely in the RF circuitry 606, solely in the FEM 608, or in both the RF circuitry 606 and the FEM 608. In some embodiments, the FEM circuitry 608 may include a TX/RX switch to switch between transmit mode and receive mode operation. The FEM circuitry may include a receive signal path and a transmit signal path. The receive signal path of the FEM circuitry may include an LNA to amplify received RF signals and provide the amplified received RF signals as an output (e.g., to the RF circuitry 606). The transmit signal path of the FEM circuitry 608 may include a power amplifier (PA) to amplify input RF signals (e.g., provided by RF circuitry 606), and one or more filters to generate RF signals for subsequent transmission (e.g., by one or more of the one or more antennas 610). In some embodiments, the PMC 612 may manage power provided to the baseband circuitry 604. In particular, the PMC 612 may control power-source selection, voltage scaling, battery charging, or DC-to-DC conversion. The PMC 612 may often be included when the device 600 is capable of being powered by a battery, for example, when the device is included in a UE. The PMC 612 may increase the power conversion efficiency while providing desirable implementation size and heat dissipation characteristics. While FIG. 6 shows the PMC 612 coupled only with the baseband circuitry 604, in other embodiments the PMC 612 may be additionally or alternatively coupled with, and perform similar power management operations for, other components such as, but not limited to, application circuitry 602, RF circuitry 606, or FEM 608. 
In some embodiments, the PMC 612 may control, or otherwise be part of, various power saving mechanisms of the device 600. For example, if the device 600 is in an RRC Connected state, where it is still connected to the RAN node as it expects to receive traffic shortly, then it may enter a state known as Discontinuous Reception Mode (DRX) after a period of inactivity. During this state, the device 600 may power down for brief intervals of time and thus save power. If there is no data traffic activity for an extended period of time, then the device 600 may transition to an RRC Idle state, where it disconnects from the network and does not perform operations such as channel quality feedback, handover, and so on. The device 600 enters a very low power state and performs paging, in which it periodically wakes up to listen to the network and then powers down again. The device 600 may not receive data in this state; in order to receive data, it must transition back to the RRC Connected state. An additional power saving mode may allow a device to be unavailable to the network for periods longer than a paging interval (ranging from seconds to a few hours). During this time, the device is totally unreachable to the network and may power down completely. Any data sent during this time incurs a large delay and it is assumed the delay is acceptable. Processors of the application circuitry 602 and processors of the baseband circuitry 604 may be used to execute elements of one or more instances of a protocol stack. For example, processors of the baseband circuitry 604, alone or in combination, may be used to execute Layer 3, Layer 2, or Layer 1 functionality, while processors of the application circuitry 602 may utilize data (e.g., packet data) received from these layers and further execute Layer 4 functionality (e.g., transmission control protocol (TCP) and user datagram protocol (UDP) layers). 
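The power-state progression described above (connected, then DRX after inactivity, then RRC Idle after extended inactivity, with traffic forcing a return to the connected state) can be sketched as a small state machine. The state names follow the description; the class name and tick-based inactivity timers are illustrative assumptions, not values from any specification.

```python
class UePowerModel:
    """Toy model of the UE power-state behavior described above."""

    def __init__(self, drx_after=5, idle_after=20):
        self.state = "RRC_CONNECTED"
        self.inactive_ticks = 0
        self.drx_after = drx_after    # inactivity ticks before entering DRX
        self.idle_after = idle_after  # inactivity ticks before RRC Idle

    def tick(self, has_traffic):
        """Advance one time step and return the resulting state."""
        if has_traffic:
            # Data activity requires (or restores) the connected state.
            self.state, self.inactive_ticks = "RRC_CONNECTED", 0
        else:
            self.inactive_ticks += 1
            if self.inactive_ticks >= self.idle_after:
                # Disconnected; wakes only periodically to listen for paging.
                self.state = "RRC_IDLE"
            elif self.inactive_ticks >= self.drx_after:
                # Still connected, but powered down between brief intervals.
                self.state = "DRX"
        return self.state
```

The model captures the key point of the passage: data cannot be received in the idle state, so any arriving traffic implies a transition back to the connected state before delivery.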
As referred to herein, Layer 3 may comprise a radio resource control (RRC) layer, described in further detail below. As referred to herein, Layer 2 may comprise a medium access control (MAC) layer, a radio link control (RLC) layer, and a packet data convergence protocol (PDCP) layer, described in further detail below. As referred to herein, Layer 1 may comprise a physical (PHY) layer of a UE/RAN node, described in further detail below. FIG. 7 illustrates example interfaces of baseband circuitry, in accordance with some embodiments of the disclosure. As discussed above, the baseband circuitry 604 of FIG. 6 may comprise processors 604A-604E and a memory 604G utilized by said processors. Each of the processors 604A-604E may include a memory interface, 704A-704E, respectively, to send/receive data to/from the memory 604G. The baseband circuitry 604 may further include one or more interfaces to communicatively couple to other circuitries/devices, such as a memory interface 712 (e.g., an interface to send/receive data to/from memory external to the baseband circuitry 604), an application circuitry interface 714 (e.g., an interface to send/receive data to/from the application circuitry 602 of FIG. 6), an RF circuitry interface 716 (e.g., an interface to send/receive data to/from RF circuitry 606 of FIG. 6), a wireless hardware connectivity interface 718 (e.g., an interface to send/receive data to/from Near Field Communication (NFC) components, Bluetooth® components (e.g., Bluetooth® Low Energy), Wi-Fi® components, and other communication components), and a power management interface 720 (e.g., an interface to send/receive power or control signals to/from the PMC 612).
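The layer-to-sublayer mapping defined above can be captured in a small lookup table; the helper function name is hypothetical, but the mapping itself is exactly as stated in the description:

```python
# Mapping of the protocol layers named above to their constituent sublayers.
PROTOCOL_LAYERS = {
    "Layer 3": ("RRC",),
    "Layer 2": ("PDCP", "RLC", "MAC"),
    "Layer 1": ("PHY",),
}


def layer_of(sublayer: str) -> str:
    """Return the layer containing the given sublayer (helper name hypothetical)."""
    for layer, sublayers in PROTOCOL_LAYERS.items():
        if sublayer in sublayers:
            return layer
    raise KeyError(f"unknown sublayer: {sublayer}")
```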
It is pointed out that elements of any of the Figures herein having the same reference numbers and/or names as elements of any other Figure herein may, in various embodiments, operate or function in a manner similar to those elements of the other Figure (without being limited to operating or functioning in such a manner). Reference in the specification to “an embodiment,” “one embodiment,” “some embodiments,” or “other embodiments” means that a particular feature, structure, or characteristic described in connection with the embodiments is included in at least some embodiments, but not necessarily all embodiments. The various appearances of “an embodiment,” “one embodiment,” or “some embodiments” are not necessarily all referring to the same embodiments. If the specification states a component, feature, structure, or characteristic “may,” “might,” or “could” be included, that particular component, feature, structure, or characteristic is not required to be included. If the specification or claim refers to “a” or “an” element, that does not mean there is only one of the elements. If the specification or claims refer to “an additional” element, that does not preclude there being more than one of the additional element. Furthermore, the particular features, structures, functions, or characteristics may be combined in any suitable manner in one or more embodiments. For example, a first embodiment may be combined with a second embodiment anywhere the particular features, structures, functions, or characteristics associated with the two embodiments are not mutually exclusive. While the disclosure has been described in conjunction with specific embodiments thereof, many alternatives, modifications, and variations of such embodiments will be apparent to those of ordinary skill in the art in light of the foregoing description. For example, other memory architectures, e.g., Dynamic RAM (DRAM), may use the embodiments discussed.
The embodiments of the disclosure are intended to embrace all such alternatives, modifications, and variations as fall within the broad scope of the appended claims. In addition, well known power/ground connections to integrated circuit (IC) chips and other components may or may not be shown within the presented figures, for simplicity of illustration and discussion, and so as not to obscure the disclosure. Further, arrangements may be shown in block diagram form in order to avoid obscuring the disclosure, and also in view of the fact that specifics with respect to implementation of such block diagram arrangements are highly dependent upon the platform within which the present disclosure is to be implemented (i.e., such specifics should be well within the purview of one skilled in the art). Where specific details (e.g., circuits) are set forth in order to describe example embodiments of the disclosure, it should be apparent to one skilled in the art that the disclosure can be practiced without, or with variation of, these specific details. The description is thus to be regarded as illustrative instead of limiting. The following examples pertain to further embodiments. Specifics in the examples may be used anywhere in one or more embodiments. All optional features of the apparatus described herein may also be implemented with respect to a method or process.
Example 1 provides an apparatus of a User Equipment (UE) operable to communicate with an Evolved Node B (eNB) on a wireless network, comprising: one or more processors to: process a first Downlink Control Information (DCI) format 0A transmission indicating a semi-persistent scheduling (SPS) activation; process a second DCI format 0A transmission indicating an SPS release; and generate one or more Uplink (UL) transmissions for an unlicensed spectrum of the wireless network after the SPS activation and before the SPS release in accordance with a configured schedule, and an interface for sending UL transmissions to a transmission circuitry and for receiving DCI format 0A transmissions from a receiving circuitry. In example 2, the apparatus of example 1, wherein at least one of the first DCI format 0A transmission and the second DCI format 0A transmission carries a UE-specific parameter. In example 3, the apparatus of any of examples 1 through 2, wherein at least one of the first DCI format 0A transmission and the second DCI format 0A transmission is scrambled with an Autonomous Uplink Cell Radio Network Temporary Identifier (AUL-C-RNTI). In example 4, the apparatus of any of examples 1 through 3, wherein the one or more UL transmissions comprise at least one of: one or more Physical Uplink Shared Channel (PUSCH) transmissions, or one or more Physical Uplink Control Channel (PUCCH) transmissions. In example 5, the apparatus of any of examples 1 through 4, wherein the one or more processors are to: process a transmission carrying the configured schedule. In example 6, the apparatus of example 5, wherein the transmission carrying the configured schedule is a Physical Downlink Control Channel (PDCCH) transmission. 
In example 7, the apparatus of any of examples 1 through 6, wherein SPS activation is indicated by one or more fields of the first DCI format 0A transmission having a first set of values; and wherein SPS release is indicated by one or more fields of the second DCI format 0A transmission having a second set of values. In example 8, the apparatus of example 7, wherein SPS activation is indicated by all bits of the one or more fields of the first DCI format 0A transmission being set to a first value; and wherein SPS release is indicated by all bits of the one or more fields of the second DCI format 0A transmission being set to a second value. In example 9, the apparatus of any of examples 1 through 8, wherein at least one of the first DCI format 0A transmission and the second DCI format 0A transmission carries at least one of: a one-bit Format Flag (FF) field; a two-bit Transmit Power Control (TPC) command field; and a five-bit Modulation and Coding Scheme (MCS) field. In example 10, the apparatus of example 9, wherein the first DCI format 0A transmission carries a two-bit TPC command field having a value of “00”; and wherein the first DCI format 0A transmission carries a five-bit MCS field with a most significant bit having a value of “0”. In example 11, the apparatus of example 9, wherein the second DCI format 0A transmission carries a two-bit TPC command field having a value of “00”; and wherein the second DCI format 0A transmission carries a five-bit MCS field having a value of “11111”. Example 12 provides a User Equipment (UE) device comprising an application processor, a memory, one or more antennas, a wireless interface for allowing the application processor to communicate with another device, and a touch-screen display, the UE device including the apparatus of any of examples 1 through 8. 
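The field encodings of examples 9 through 11 can be illustrated with a short sketch. The function name and return labels below are hypothetical; only the stated semantics are assumed: a two-bit TPC command field of “00” combined with a five-bit MCS field whose most significant bit is “0” indicates SPS activation, while TPC “00” combined with MCS “11111” indicates SPS release:

```python
def classify_dci_0a(tpc: str, mcs: str) -> str:
    """Classify a DCI format 0A message per the field values in examples 9-11.

    tpc: two-bit Transmit Power Control command field, e.g. "00"
    mcs: five-bit Modulation and Coding Scheme field, e.g. "11111"
    (Function name and return labels are hypothetical.)
    """
    if len(tpc) != 2 or len(mcs) != 5:
        raise ValueError("TPC must be 2 bits and MCS must be 5 bits")
    if tpc == "00" and mcs == "11111":
        return "sps_release"       # example 11: all-ones MCS signals release
    if tpc == "00" and mcs[0] == "0":
        return "sps_activation"    # example 10: MCS MSB of "0" signals activation
    return "other"
```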
Example 13 provides machine readable storage media having machine executable instructions that, when executed, cause one or more processors of a User Equipment (UE) operable to communicate with an Evolved Node-B (eNB) on a wireless network to perform an operation comprising: process a first Downlink Control Information (DCI) format 0A transmission indicating a semi-persistent scheduling (SPS) activation; process a second DCI format 0A transmission indicating an SPS release; and generate one or more Uplink (UL) transmissions for an unlicensed spectrum of the wireless network after the SPS activation and before the SPS release in accordance with a configured schedule. In example 14, the machine readable storage media of example 13, wherein at least one of the first DCI format 0A transmission and the second DCI format 0A transmission carries a UE-specific parameter. In example 15, the machine readable storage media of any of examples 13 through 14, wherein at least one of the first DCI format 0A transmission and the second DCI format 0A transmission is scrambled with an Autonomous Uplink Cell Radio Network Temporary Identifier (AUL-C-RNTI). In example 16, the machine readable storage media of any of examples 13 through 15, wherein the one or more UL transmissions comprise at least one of: one or more Physical Uplink Shared Channel (PUSCH) transmissions, or one or more Physical Uplink Control Channel (PUCCH) transmissions. In example 17, the machine readable storage media of any of examples 13 through 16, the operation comprising: process a transmission carrying the configured schedule. In example 18, the machine readable storage media of example 17, wherein the transmission carrying the configured schedule is a Physical Downlink Control Channel (PDCCH) transmission. 
In example 19, the machine readable storage media of any of examples 13 through 18, wherein SPS activation is indicated by one or more fields of the first DCI format 0A transmission having a first set of values; and wherein SPS release is indicated by one or more fields of the second DCI format 0A transmission having a second set of values. In example 20, the machine readable storage media of example 19, wherein SPS activation is indicated by all bits of the one or more fields of the first DCI format 0A transmission being set to a first value; and wherein SPS release is indicated by all bits of the one or more fields of the second DCI format 0A transmission being set to a second value. In example 21, the machine readable storage media of any of examples 13 through 20, wherein at least one of the first DCI format 0A transmission and the second DCI format 0A transmission carries at least one of: a one-bit Format Flag (FF) field; a two-bit Transmit Power Control (TPC) command field; and a five-bit Modulation and Coding Scheme (MCS) field. In example 22, the machine readable storage media of example 21, wherein the first DCI format 0A transmission carries a two-bit TPC command field having a value of “00”; and wherein the first DCI format 0A transmission carries a five-bit MCS field with a most significant bit having a value of “0”. In example 23, the machine readable storage media of example 21, wherein the second DCI format 0A transmission carries a two-bit TPC command field having a value of “00”; and wherein the second DCI format 0A transmission carries a five-bit MCS field having a value of “11111”.
Example 24 provides an apparatus of a User Equipment (UE) operable to communicate with an Evolved Node B (eNB) on a wireless network, comprising: one or more processors to: process a first Downlink (DL) transmission scrambled with an Autonomous Uplink Cell Radio Network Temporary Identifier (AUL-C-RNTI), the first DL transmission indicating a semi-persistent scheduling (SPS) activation; process a second DL transmission scrambled with the AUL-C-RNTI, the second DL transmission indicating an SPS release; and generate one or more Uplink (UL) transmissions for an unlicensed spectrum of the wireless network after the SPS activation and before the SPS release in accordance with a configured schedule, and an interface for sending UL transmissions to a transmission circuitry and for receiving DL transmissions from a receiving circuitry. In example 25, the apparatus of example 24, wherein at least one of the first DL transmission and the second DL transmission carries a UE-specific parameter. In example 26, the apparatus of any of examples 24 through 25, wherein the one or more UL transmissions comprise at least one of: one or more Physical Uplink Shared Channel (PUSCH) transmissions, or one or more Physical Uplink Control Channel (PUCCH) transmissions. In example 27, the apparatus of any of examples 24 through 26, wherein the one or more processors are to: process a DL transmission carrying the configured schedule. In example 28, the apparatus of example 27, wherein the DL transmission carrying the configured schedule is a Physical Downlink Control Channel (PDCCH) transmission. In example 29, the apparatus of any of examples 24 through 28, wherein the first DL transmission is a first Downlink Control Information (DCI) format 0A transmission; and wherein the second DL transmission is a second DCI format 0A transmission. 
In example 30, the apparatus of example 29, wherein SPS activation is indicated by one or more fields of the first DCI format 0A transmission having a first set of values; and wherein SPS release is indicated by one or more fields of the second DCI format 0A transmission having a second set of values. In example 31, the apparatus of example 30, wherein SPS activation is indicated by all bits of one or more fields of the first DCI format 0A transmission being set to a first value; and wherein SPS release is indicated by all bits of one or more fields of the second DCI format 0A transmission being set to a second value. Example 32 provides machine readable storage media having machine executable instructions that, when executed, cause one or more processors of a User Equipment (UE) operable to communicate with an Evolved Node-B (eNB) on a wireless network to perform an operation comprising: process a first Downlink (DL) transmission scrambled with an Autonomous Uplink Cell Radio Network Temporary Identifier (AUL-C-RNTI), the first DL transmission indicating a semi-persistent scheduling (SPS) activation; process a second DL transmission scrambled with the AUL-C-RNTI, the second DL transmission indicating an SPS release; and generate one or more Uplink (UL) transmissions for an unlicensed spectrum of the wireless network after the SPS activation and before the SPS release in accordance with a configured schedule. In example 33, the machine readable storage media of example 32, wherein at least one of the first DL transmission and the second DL transmission carries a UE-specific parameter. In example 34, the machine readable storage media of any of examples 32 through 33, wherein the one or more UL transmissions comprise at least one of: one or more Physical Uplink Shared Channel (PUSCH) transmissions, or one or more Physical Uplink Control Channel (PUCCH) transmissions. 
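The AUL-C-RNTI scrambling referenced in examples 24 and 32 can be illustrated with a minimal sketch. In LTE, a DCI's 16-bit CRC is scrambled by XOR-ing it with the target RNTI, so that only a UE holding the matching identifier recovers a valid CRC; the function name and example values below are hypothetical:

```python
def scramble_crc_with_rnti(crc16: int, rnti: int) -> int:
    """Scramble (or descramble) a 16-bit CRC with a 16-bit RNTI via XOR.

    XOR is its own inverse, so applying the same RNTI twice recovers the
    original CRC; a UE with the wrong RNTI sees a CRC mismatch and
    discards the transmission. (Function name hypothetical.)
    """
    if not (0 <= crc16 < 1 << 16 and 0 <= rnti < 1 << 16):
        raise ValueError("crc16 and rnti must be 16-bit values")
    return crc16 ^ rnti
```

A UE would apply this with its configured AUL-C-RNTI when attempting to decode candidate DL transmissions.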
In example 35, the machine readable storage media of any of examples 32 through 34, the operation comprising: process a DL transmission carrying the configured schedule. In example 36, the machine readable storage media of example 35, wherein the DL transmission carrying the configured schedule is a Physical Downlink Control Channel (PDCCH) transmission. In example 37, the machine readable storage media of any of examples 32 through 36, wherein the first DL transmission is a first Downlink Control Information (DCI) format 0A transmission; and wherein the second DL transmission is a second DCI format 0A transmission. In example 38, the machine readable storage media of example 37, wherein SPS activation is indicated by one or more fields of the first DCI format 0A transmission having a first set of values; and wherein SPS release is indicated by one or more fields of the second DCI format 0A transmission having a second set of values. In example 39, the machine readable storage media of example 38, wherein SPS activation is indicated by all bits of one or more fields of the first DCI format 0A transmission being set to a first value; and wherein SPS release is indicated by all bits of one or more fields of the second DCI format 0A transmission being set to a second value. In example 40, the apparatus of any of examples 1 through 8, and 24 through 31, wherein the one or more processors comprise a baseband processor. In example 41, the apparatus of any of examples 1 through 8, and 24 through 31, comprising a memory for storing instructions, the memory being coupled to the one or more processors. In example 42, the apparatus of any of examples 1 through 8, and 24 through 31, comprising a transceiver circuitry for at least one of: generating transmissions, encoding transmissions, processing transmissions, or decoding transmissions. 
In example 43, the apparatus of any of examples 1 through 8, and 24 through 31, comprising a transceiver circuitry for generating transmissions and processing transmissions. An abstract is provided that will allow the reader to ascertain the nature and gist of the technical disclosure. The abstract is submitted with the understanding that it will not be used to limit the scope or meaning of the claims. The following claims are hereby incorporated into the detailed description, with each claim standing on its own as a separate embodiment.
16995693 apple inc. USA B2 Utility Patent Grant (with pre-grant publication) issued on or after January 2, 2001. Open Apr 27th, 2022 08:45AM Apr 27th, 2022 08:45AM Technology Technology Hardware & Equipment Information Technology
